#12StudiesOfXmas

Posted on 8 January 2020

Jonathan Haslam, Director, IEE

Each year we publish a seasonal round-up of our favourite studies of the year. There’s nothing systematic about the selection, just a mix of the interesting and the different. For me, reading new research as it comes out is a great way to challenge and develop my thinking. That moment when a project you thought would work doesn’t, or one that you were sceptical of comes in with a big positive result. Do you dismiss the result or let it challenge your beliefs and opinions? We all like to think it would be the latter, but as JK Galbraith said, “Faced with the choice between changing one’s mind and proving that there is no need to do so, almost everyone gets busy on the proof.”

1. Little evidence of the effectiveness of CPD

Professional development is not easy. In a way, that is not surprising. For example, if you’re training head teachers in order to get them to change their behaviour, in order to get their teachers to change their behaviour, in order to get their students to change their behaviour, it’s not surprising that you can’t achieve this in a few twilight sessions. This systematic review found that for some kinds of CPD there weren’t enough high-quality studies evaluating its impact. For other types of CPD, where there were enough high-quality studies, they showed no positive impact. As well as this review, we’ve also covered several individual studies of CPD this year that, similarly, haven’t shown great results (here and here). All this isn’t to say that CPD can’t have impact, or doesn’t have potential for impact, but this year the evidence has been stacking up on the “tough to do” side.

2. Nudging proves difficult to scale up

Using text messages to encourage small changes in behaviour has been a popular focus for interventions recently. Inspired by behavioural economics, these approaches are appealing because they are inexpensive and don’t take much time to set up, so even if they only have a small impact they may be worthwhile. This large-scale study with 800,000 students in the US, though, suggests that they don’t exist in a vacuum. It found that there was no impact on outcomes, perhaps because the messages came from organisations that had no pre-existing relationship with the students, and the messages were fairly generic. If you don’t know your students and their families, and they don’t know you, maybe sending supportive text messages isn’t the place to start.

3. The impact of refugees on local children

This is a neat analysis of the impact of Haitian refugees who moved to Florida in the wake of the 2010 earthquake. It shows that they had no negative impact on the performance of local Florida students, and perhaps a slight positive one. This is reassuring evidence in the face of so much negative publicity about the impact of refugees. A caveat to these positive findings is that Florida already had an existing population of Haitian immigrants and, as one of the better-funded US states, may have had systems in place to cope with the influx.

4. Results of a large randomised controlled trial of growth mindset

Growth mindset continued to collect mixed results in 2019. This large-scale US study showed a positive impact on student achievement, whereas this English trial showed no impact at all. An interesting point to reflect on, particularly coming from the US trial, is that teaching students about growth mindset doesn’t need to be particularly time-consuming. The recent US study took less than an hour. Even if that results in only a small impact, it’s probably worth having. The English trial, in a related point, suggests that basing your entire teaching approach around growth mindset is probably unnecessarily complicated, and unlikely to bring sufficient benefit for it to be worth the effort.

5. Student allocation to maths sets not always based on previous achievement

Sometimes I read a study, and its significance only dawns on me as time goes by. This study shows that around one-third of students were “misallocated” to maths sets at the start of secondary school. Moreover, black, Asian and female students were disproportionately likely to be allocated to lower sets. At the time, I wondered if the schools might “know” their students better, and whether scepticism about Key Stage 2 SATs results is justified. But looking into it further, research in 2013 showed that KS2 achievement was generally more strongly correlated with achievement at KS3 and GCSE than either CATs or MidYIS assessments were. And research in 2018 showed that KS2 SATs results are reliable. Getting an accurate assessment of a student’s achievement is difficult, and the less robust the assessment, the greater the chance of bias slipping in.

6. Reassessing concerns about school may help improve academic achievement

With the growing concern over children’s mental health, I sometimes worry that schools are being encouraged to work outside their comfort zone. Teachers are not psychiatrists, and applying a little knowledge in this area may be a dangerous thing. This study showed that a positive impact can be made using more “normal” teaching activities. Here, concerns about moving to “big school” were addressed by encouraging students to carry out writing exercises reflecting on the way that this was most likely a temporary feeling, supported by stories from older pupils about how they had coped. It seems to me that a strategy like this is much more achievable, and within current capabilities, than something more therapeutic in nature.

7. Printed vs digital text: A meta-analysis

We all have our biases – things that we would like to see confirmed by research. I always struggle to read research papers on screen. I can give them a quick skim and get the gist, but if I want to understand a paper more thoroughly, I tend to print it out. This meta-analysis found that, for people of all ages, reading comprehension improved with printed text, and that people were better able to judge their own level of comprehension. One drawback is that this analysis included only a small number of studies with school children. But it’s reassuring for those of us who still appreciate the printed word. A meta-analysis we covered in 2018 found similar results, although mostly in studies with undergraduates.

8. Small class size vs. evidence-based interventions

There are some issues that are considered resolved, but it is still useful to be reminded of what the research says, and to check that it still applies today. For example, reducing class sizes is moderately effective, but an expensive way of getting that benefit. In France, though, it still seems to be a popular solution. In 2017, the government introduced reduced class sizes (12 students) in certain education priority zones. This prompted researchers to look at what had happened in France the last time this had been tried (in 2003). Reducing the class size did indeed have a positive impact, but only about the same amount as using an evidence-based reading intervention. In a further study, researchers also found that any gains from reducing class size were lost in the following year if students went back to a full-size class. So, yes, reducing class sizes will have a positive impact, but it probably isn’t the best way of spending scarce funds.

9. Sure Start had positive health benefits for children in poorer neighbourhoods

There is a shortage of evidence on the impact of the Sure Start programme, which is a shame given the amount of money spent on it. It would be helpful if we had a clearer idea of what impact the spending had, in order to justify further expenditure in the future. One narrative that is emerging, for example, is that achieving universal, high-quality early childhood support is difficult. You can have high-quality, targeted support, or you can have universal support, but you can’t have both. Unfortunately, without better evaluation of the initiatives that have taken place, we won’t know which is the best approach. This study found small positive health benefits, but given the billions that were spent, it would be good to know much more about Sure Start’s wider impact.

10. Interleaved practice improves maths test scores

Interleaving is an effective learning strategy that is very fashionable at the moment. However, it is still useful to see studies of it being used in (almost) real-world conditions. It sometimes concerns me that these strategies have mostly been trialled with undergraduate psychology students rather than Year 7s on a wet Wednesday. This study, carried out in maths with Year 8s in the US, provides useful evidence on the effectiveness of interleaving. One interesting finding is that the interleaving approach took longer than regular quizzing, meaning (a) it takes more time to implement, and (b) students spend longer on the topic. One of our school-led innovation projects also provided some interesting nuance on implementing interleaving.

11. How engaged are teachers with research?

If you’re reading this, you’re probably interested in research, and practitioner interest in research has grown massively in the years that we have been publishing Best Evidence in Brief. Sometimes we might believe that this bubble is now so huge that it covers the entire teaching profession. This research sounds a note or two of caution. While research evidence was viewed positively, it played a small role in teacher decision-making. More than 80% of teacher CPD wasn’t research-based. And teachers don’t usually go to research for their ideas. Another study this year showed that CPD to help teachers and schools use research evidence is likely to be as difficult to make effective as any other kind of CPD. Proof that research engagement is worth the effort remains elusive. There may be a moral case, but we can’t yet prove that it improves outcomes for children.

12. Test anxiety intervention and uncertain control

As high-stakes exams have become more challenging, there’s growing interest in what, if anything, can be done about students’ test anxiety, and we’ve covered a few studies on the issue this year. The first, from the UK, used cognitive and behavioural approaches to produce a moderate reduction in the worry and tension components of test anxiety. Of course, test anxiety isn’t limited to the UK. One Chinese study showed how test anxiety isn’t always a bad thing, and can vary between pupils with different achievement levels, particularly when they have different “stakes” in the outcomes of the test. A second Chinese study looked at how encouraging students to carry out expressive writing tasks (on the positive emotions they had each day) lowered test anxiety.

I hope you’ve enjoyed this brief look at the research we covered in 2019. You can find these, and more than 800 other studies, tagged by keyword, in our searchable archive.

I’ll end with another quote, looking forward to what we might discover in 2020.

“1,500 years ago, everybody knew that the Earth was the center of the universe. 500 years ago, everybody knew that the Earth was flat. And 15 minutes ago, you knew that humans were alone on this planet. Imagine what you’ll know tomorrow.” (Men in Black)


A final note: The IEE has brought you Best Evidence in Brief free for the last eight years. This includes both the fortnightly e-newsletter and a searchable website. We want to continue to make it freely available to as many people as possible, and with that in mind, we would like to ask for donations to support the service.
