Dictated by data?


Nathan Archer explores the idea that an outcome-driven, accountability culture is shaping a limited and limiting research agenda. He calls for richer, more diverse perspectives.

As a novice researcher, I have been reflecting on ‘what counts’ as research in early childhood.

In the latter part of 2018, two episodes pushed my thinking about the relationship between the early childhood education sector and some current research agendas. The first episode involved reading tweets from a policy conference which included comments from a Department for Education official: ‘there is far less evidence on what works in early years, especially compared to schools’.

The second was a conference presentation by a research school which seemed to suggest that, in its research work, pupil assessment data was the primary starting point for a research question or problem. An example was given of progress towards expected outcomes in literacy, the resulting interventions and the evaluation of these.

These experiences prompted me to consider two questions. First, how did the notion of ‘what works’ come to be such a powerful story in educational research? And second, where are the other research stories?

The idea of ‘what works’ has developed over the last two decades and in education has been promoted as a plan to better join up policy, research and practice. In the UK, this approach aims ‘to embed robust evidence at the heart of policy-making and service delivery’ (Cabinet Office, 2019). One ‘what works’ centre is the Education Endowment Foundation (EEF). The EEF website instructs us to ‘start here if you're looking for evidence-based programmes which can help boost young people's literacy, numeracy, science, and metacognitive learning.’ The foundation hosts and funds an extensive bank of research.

For me, this raises a number of questions. If policy-makers are seeking what works, what ends should it work for, who should it work for and in what contexts? And can it always ‘work’?

What ends should it work for?

Much of the research which falls under this approach is based on seeking to improve a finite set of pre-determined outcomes (primarily in literacy, numeracy and science) and is informed by questions of ‘how’ these outcomes can be achieved rather than ‘why’. This ‘howness’ is evident in the push to establish causes by means of experimental research in order to secure findings. Such experimental research often takes the form of randomised controlled trials (RCTs) and impact evaluations designed to understand the causes and effectiveness of interventions, and it is often seen by enthusiastic supporters of evidence-based education policy-making as the preferable, or indeed the only, method capable of providing secure evidence about what works.

Ultimately, this approach is not neutral; it embodies a particular view of professional practice as implementation. I argue that this emphasis on the ‘how’ of the specific interventions being researched risks reducing teaching to a technical act, one which promotes the idea that if educators apply these sanctioned interventions, desirable outcomes will be achieved.

Returning to the example of the research school, while such interventions were deemed successful by the school in terms of improving outcomes, I was left pondering the limitations of only asking questions informed by existing assessment data. Are these the only questions which need researching? Are there not many hypotheses and broader questions that might be starting points for a research endeavour? Or is an outcome-driven accountability culture shaping a limited and limiting research agenda?

If research is driven only by data associated with predetermined learning outcomes, I suggest we are asking limited questions. In this way, ‘what works’ might be read as shorthand for understanding processes. It seldom asks bigger questions, or questions about the different contexts in which teaching takes place. So, when the questions are shaped by ideology and a particular view of what research should be, this is not evidence-based policy making but policy-based evidence making.

I think it is important to look at the dominance of certain forms of research promoted by policy-makers. Much of this research offers important contributions to understandings of teaching and learning – but it is partial. It does not (usually) fully engage with the moral, political and relational dimensions (among many others) of our work with young children.

In ‘Is It “What Works” that Matters?’, Sanderson (2003) asks not what is effective but what is appropriate. The Early Years Toolkit on the EEF website, by contrast, exemplifies the effectiveness emphasis: it features a list of interventions presented with a focus on cost effectiveness and evidence strength. Cost effectiveness is illustrated by a number of pound signs indicating value for money, while evidence strength is determined by the number and nature of the studies included; these indicators are used to rank the research. The toolkit also translates ‘effect size’ (the extent of the difference between two groups in a trial, where one group receives the intervention and the other does not) into the months of progress a child is expected to make as a result of the intervention. I see such a causal relationship as reductive, unable to capture the complexity and nuance of the possible contributing factors in a child's development.

While the growth of rigorous and valid research is to be welcomed, should such high-profile research be limited to ‘what works’ – that is, to research in which interventions are evaluated only by cost effectiveness and by ‘additional progress’, measured in months, in a child's learning and development?

Who is the research for?

I also question who is researching? From what perspective? And with what intentions? Who is included and excluded in this process? And who is the research for?

While much of the ‘what works’ research may be generated by, or at least participated in by, practitioner-researchers, the parameters of what is supported through this approach are limited to participants willing to focus on predetermined outcomes. Educators seeking to adopt alternative theoretical and methodological approaches are not funded equitably.

I argue that the model of banks of ‘what works’ research to be shared and acted upon also compromises teacher autonomy. Biesta (2007) suggests that this kind of research focus raises questions about who is allowed and who should be allowed to participate in decisions about practice and who decides what is educationally important and desirable. Ultimately, this approach could restrict the opportunities for participation in decision-making in policy, practice and research.

In what contexts?

As a term, evidence-based education has become synonymous with randomised controlled trials, just as trials of new medications have become synonymous with evidence-based medicine. Questions of possible individual bias, or of the challenge of controlling variables, are seemingly overcome through the comparison of multiple studies, or meta-analysis.

I argue that the ‘what works’ perspective is premised on the idea that an intervention is applicable to all and effective for all. It is an approach which assumes that if an educator applies a particular ‘given’ intervention, then the outcome is both predictable and desirable in multiple circumstances. But such a ‘one size fits all’ perspective ignores context; it disregards the local and situated nature of educational practice. Additionally, as mentioned earlier, it assumes that fixed and given desirable education outcomes drive the research. This might be seen as what Malaguzzi called ‘prophetic pedagogy’. He commented on the risks of an education that:

‘…contemplates everything and prophesies everything, sees everything to the point that it is capable of giving you recipes for little bits of actions, minute by minute, hour by hour, objective by objective, five minutes by five minutes. This is something so coarse, so cowardly, so humiliating of teachers' ingenuity, a complete humiliation for children's ingenuity and potential.’ (Cagliari et al., 2016, p.xvi)

If ideas of ‘what works’ come to dominate perspectives of research in early childhood education, we risk losing numerous other world views in multiple contexts which inform our research work and consequently our practice as researchers and practitioners.

Linked to this idea of context is the assumption that ‘what works’ evidence-based interventions are always generalisable and transferable. It is research as experiment and test-bed, collated to ‘generate, translate and adopt’ evidence for practice (Cabinet Office, 2019). However, I argue that such research is undertaken in particular social, historical, cultural and geographical situations. In the case of randomised controlled trials, for example, we can learn about possible relationships between interventions and results in specific contexts, but not necessarily whether these can be replicated successfully elsewhere. As Biesta (2007, p.16) tells us, ‘research, in short, can tell us what worked but cannot tell us what works’.

So where are the other research stories?

It is important that I underline my acknowledgement of the breadth of theoretical and methodological approaches to research that are adopted in the field. There is an ever-growing wealth of rich, exciting research studies in early childhood education. I also see ever stronger links between practitioners, settings and research organisations exploring policy and practice.

But it is the authority of the ‘what works’ story which troubles me. Notwithstanding the diverse range of research that a whole host of schools, settings and higher education institutions engage in, it is the government-sanctioned (and funded) prevalence of ‘what works’ research that we should continue to question. In short, ‘what works’ is a dominant discourse. Acknowledging such dominant discourses in early childhood, Peter Moss (2018) argues the need to hear ‘alternative narratives’ or ‘other stories’.

An environment in which ‘what works’ research is elevated in status to the detriment (or exclusion) of other forms of research and other theories is an impoverished one.

I am currently researching the professional life stories of early educators in 21st-century England. This has brought me into contact with a diverse range of dedicated early childhood educators and their rich, complex narratives as they navigate shifting professional landscapes. Notably, I see and hear early educators asserting their professional agency and developing creative, relational responses to local challenges rather than applying definitive, replicable answers to seemingly technical problems.

In the face of policy developments which can seem instrumental and reductive, I think it is vital that we continue to hone our critical awareness. Developing a form of ‘critical literacy’ of policy and research supports our professional confidence and professional judgement to engage with, and to critique, the research. Furthermore, it empowers us to act on community concerns, forge new alliances, represent ourselves strategically to policy-makers and engage critically with government agendas (Sumsion, 2007).

I hope that the links which we continue to make between research, policy and practice might move beyond ‘what works’ alone and offer richer, more diverse perspectives. As Bruner (2002, p.103) suggests: ‘The tyranny of the single story surely led our forebears to guarantee freedom of expression…let many stories bloom’.

Key points

  • Much of the ‘what works’ research is based on seeking to improve a finite set of pre-determined outcomes (primarily in literacy, numeracy and science)
  • Developing a form of ‘critical literacy’ of policy and research supports our professional confidence and professional judgement to engage with, and to critique, the research
  • An environment in which ‘what works’ research is elevated in status to the detriment of other forms of research and other theories is an impoverished one
