Five more IEE Innovation Evaluation reports published

Posted on 4 February 2020

A further five reports from projects funded by the IEE Innovation Evaluation Grants have been published on our website. These five small-scale evaluations test pilot innovations in teaching and learning approaches in reading, maths, science, retention, and cognitive science that aim to improve pupil attainment.

Two of the evaluations looked at innovations designed to improve reading skills, and both suggested positive impacts. Enhancing reading skills in young learners investigated whether the reading skills of Year 1 pupils with low reading ages or poor phonological awareness could be developed through explicit teaching of sight word recognition and phonological awareness training. The results showed an effect size of +0.4 for pupil progress in reading accuracy and comprehension in the intervention group. On average, pupils in the intervention group made 2.3 months more progress in reading age than the control group.

In the second reading innovation evaluation, Northgate High School evaluated a structured after-school reading club and its impact on reading age. The intervention was delivered to disadvantaged Year 7 pupils in four schools in Suffolk. The evaluation found that the intervention had a positive impact on reading age, as measured by the Access Reading Test interactive (ARTi). Over the course of the evaluation, the intervention group made 8.67 months more mean progress in reading age than the control group, a positive effect size of +0.34.
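For readers less familiar with the effect sizes quoted throughout these reports, the sketch below shows the standard standardised mean difference calculation (Cohen's d with a pooled standard deviation), which is the usual basis for figures like +0.34. The function name and the reading-age data are illustrative assumptions, not taken from the Northgate report, and the individual evaluations may have used a different variant of the calculation.

```python
import statistics

def cohens_d(intervention_scores, control_scores):
    """Standardised mean difference: (mean_i - mean_c) / pooled SD."""
    mean_i = statistics.mean(intervention_scores)
    mean_c = statistics.mean(control_scores)
    sd_i = statistics.stdev(intervention_scores)
    sd_c = statistics.stdev(control_scores)
    n_i, n_c = len(intervention_scores), len(control_scores)
    # Pooled standard deviation, weighted by each group's degrees of freedom
    pooled_sd = (((n_i - 1) * sd_i**2 + (n_c - 1) * sd_c**2) / (n_i + n_c - 2)) ** 0.5
    return (mean_i - mean_c) / pooled_sd

# Hypothetical reading-age gains in months (illustrative only, not data from the reports)
intervention = [14, 18, 11, 20, 16, 13]
control = [7, 9, 5, 10, 8, 6]
print(f"Effect size: {cohens_d(intervention, control):+.2f}")
```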

The evaluation of the use of learning journals during Year 6 science lessons also found positive results. The intervention group made more progress on the test between term 1 and term 6 than the control group (effect size +0.72). However, the data suggest that the intervention had limited impact on pupil premium pupils and pupils with special educational needs and disabilities, although both groups were relatively small.

There were mixed results for the DoNow+ evaluation, which tested whether Year 8 pupils’ responses to analytical questions could be improved through low-stakes quizzing and answering analytical questions in history and English lessons. The evaluation found that the intervention had a positive impact on pupil performance in history (effect size +0.23) but a negative one in English (effect size -0.35). The process evaluation suggested that, even in English, there were positive side-effects of the innovation, particularly in terms of classroom management.

Finally, there were some interesting results for the Let’s Think Early Years evaluation, which found that intervention pupils made more progress in spatial reasoning than within-class control pupils, but intervention classes made less progress than control classes. The reasons for these findings are unclear, but they may be due to a lack of comparability between the control and intervention classes.

All of these are small-scale projects, sometimes carried out in only one school, so their findings cannot be generalised. For those carrying out innovation evaluation projects, the main benefit is often the process itself, rather than the findings, and the questions it raises about the intervention and how it is implemented in school.

If you have an idea for an innovation in your school and want to find out whether or not it works, funding is now open for the next round of Learning by Questions (LbQ) innovation evaluation projects, with grants of up to £5,000 available for your school.

This new round of LbQ innovation evaluation projects will support you all the way with the challenge of evaluating in-school change. We know that budgets and time are tight, and that finding out what works in school can be challenging, so, building on our previous experience, we’ve developed an extensive package of training and support. To find out more, please register for one of the free evaluation twilight sessions.