Mostly ‘armless: The Literacy Octopus and what it takes to change school practice
Jonathan Haslam, Deputy Director, IEE
The Education Endowment Foundation today published the results of its Literacy Octopus trial, which looked at the impact of research dissemination on achievement in schools.
Arguably the implications are even wider than that, because this was, in effect, about communicating information to schools in order to achieve a change in practice. That communication might not be limited to research evidence – it could be marketing, or health information, for example, or anything that aims to encourage a difference in school behaviour.
As part of the trial, the IEE provided schools with materials (our Better magazine and Best Evidence in Brief e-newsletter) and ran evidence fairs (which gave an overview of the evidence and introduced schools to interventions with evidence of effectiveness). We are very grateful for the support the EEF gave us in being part of this trial.
The first thing to reflect on is that when you’re actually in a trial, it’s a lot more complicated than when you’re outside, poking fun at it. The detail that goes into the planning and preparation of a trial is incredible, and you have to think carefully through the whole process to understand your role in it, and how you can give “your” intervention the best chance of success. It has made me more reluctant to criticise other trials for being less than perfect. Despite their robust methodology, trials are ultimately run by humans, and have humans as their participants, and we all have our failings and foibles.
On reflection, I think it was a big leap for the schools recruited in the trial to go from being generally interested in improving literacy by using research evidence to choosing and using one of the interventions with evidence that we presented them with. I don’t think that we provided enough communication, support, challenge, etc, to give them the confidence to make that commitment (leaving aside the question as to whether that intervention was even the right one for that school). On the other hand, this trial was intentionally testing light-touch approaches, so I suppose that if we provided all those layers of support we would have been testing something rather different. It’s complicated.
Given my background, I was very interested in the outcomes of the Literacy Octopus Dissemination Trial. This looked at the impact of sending evidence-based material (in our case, our Better magazine and Best Evidence in Brief e-newsletter) to a range of different schools. From a previous study, there was evidence that even this low-key dissemination could have an impact on schools. And because it was low cost, even a small impact would be worth it. (Studies looking at the use of text messaging – see http://www.beib.org.uk/2016/07/text-messages-add-up-to-improvement/ and http://www.beib.org.uk/2014/12/texting-parents-helped-with-early-literacy/ – are other examples of low-cost initiatives that have a small impact, but are still worth it.)
In this latest trial, providing the materials showed no consistent positive impact for any of the approaches. It was interesting that (although this evidence is not robust) IEE materials did better on the secondary outcomes (a positive disposition to academic research in informing teaching practice, and use of academic research to inform the selection of teaching approaches) than on turning this into practice (ie, getting schools to use an intervention and for this to have an effect on student outcomes). This is what I would expect. I think the IEE helps to make research accessible and understandable, but we are not the experts on enacting that research in the classroom. Schools have the expertise in this area, hence our involvement in, and support for, the Research Schools Network.
It was also interesting (although not part of the report) to see the difference in engagement between the different groups. Open rates (the percentage of recipients who open an email) are one way of measuring whether emails are being read by (and therefore proving interesting to) their recipients. Here’s how they differed:
- Literacy Octopus Dissemination Trial (sent the email “cold”): 6%
- Evidence-based Literacy Support Control (signed up for the trial, but only received materials): 18%
- Evidence-based Literacy Support Intervention (those invited to the evidence fairs): 20%
- Best Evidence in Brief subscribers (our usual mailing list): 30%
So there is a clear difference in the levels of interest and engagement across these various groups.
It takes time
There is always a delay between participating in a trial and learning the results, and in the intervening time no one sits still. We have learnt so much, both through the trial and since, about how to get research evidence used in practice. The report points out:
> Providers noted how teachers were reassured by the authority of evidence-based strategies but felt that most teachers would be more concerned with understanding the strategy itself and how to apply it than with interrogating the evidence-base.
This is so true. I think we can now see that there is more than one pathway to getting research evidence used in schools. Some will be interested in the research itself, digging into it, understanding it, and then using it. But for others, simply knowing what has been found to work, and therefore has the best chance of working for them, is enough. Teachers, naturally, want to get on with it, and do the best for their pupils. So, for example, the EEF guidance reports, and the high-quality CPD that Research Schools are developing on the back of them, are ways of getting straight to the strategies. My caveat would be that we should still persevere, and encourage all schools to be questioning about the evidence. Marketing may reassure us with “Yeah, yeah, there’s evidence behind it,” but this always needs to be questioned.
I think we also need to be clearer on the benefits of engaging with research. Simply knowing what the latest evidence is will not be enough to improve outcomes for Year 10. I hope that we have not over-promised this in the past, but knowledge of research, while useful, is nowhere near enough to change practice. Changing habits is hard. Supportive leadership, effective CPD… there is a whole range of things that need to be done well for this to work.
In the light of this trial, if you are interested in evidence-informed practice it may be worth considering a few questions before delving in too far:
- What do you want to achieve by engaging with evidence?
- Are you interested in the research evidence in its own right or do you simply want strategies that have been shown to work?
- What evidence is there (either internal or external) for the impact of the approaches that you are currently using?
- How do you filter all the communications you read and share? Are they having the impact you hope? How do you know?
- What are the key changes that you want to make? How will you know when you have made them?