What does it all mean?
Jonathan Haslam, Director, IEE
“We’ve learned from experience that the truth will come out. Other experimenters will repeat your experiment and find out whether you were wrong or right.” Richard Feynman
This week sees the publication of the first reports of the IEE Innovation Evaluation Grants.
Part of the Research Schools Network project, these pilot evaluations of innovations in teaching and learning approaches support the network’s goal of improving the attainment of pupils by increasing the use of evidence-based practices.
The evaluations are small-scale, and test the kinds of innovations that schools are interested in. This is very much a “bottom-up” exercise, allowing schools to get some indicative evidence behind real-world initiatives.
The first evaluations found that:
- “active banking” of vocabulary improved reading comprehension outcomes for previously high-attaining pupils in early secondary, though we do not know what the effect might be for previously lower-attaining pupils
- Knowledge Organisers used in a Year 8 English class led to worse outcomes on the end-of-project test, particularly for previously lower-attaining pupils
It is crucial not to over-claim the importance of these two results. These are small-scale projects, each carried out in one school, so it is not possible to generalise their findings.
In fact, the main benefit of these projects may be in the process, rather than the findings. The schools, and staff at the IEE, have put in a tremendous amount of work on these projects, and along the way have learned a lot.
It is possible for schools to carry out evaluations of small-scale innovations, and such evaluations do not need to cost a lot of money. One of the most important aspects of these projects was the information and challenge provided to schools to make their evaluations as robust as possible. As much as anything, this required a change of mindset about how changes in practice are introduced.
Schools are perfectly capable of developing and evaluating changes in practice. Should this be called research? Perhaps not, but not all of these kinds of investigations need to be done by external agencies. Given support, teachers and schools can do it themselves.
Running as robust an evaluation as possible can provide useful challenge when implementing change. Both evaluations provided useful food for thought, and arguably reasons for a slower, more methodical approach to implementing change within school.
What are the next steps for projects such as these? For those with disappointing results, a rethink is required, but that does not necessarily mean ditching the approach altogether. Instead, more consideration may be needed in the way that the approach is structured, introduced, and incorporated into current teaching practice. For those with successful results, a larger trial involving more schools is a potential next step. This evaluation alone is not enough to justify uncritical scale-up; replication is needed to improve confidence.
We will be publishing more reports over the coming year. If you would like to find out more about these and the project as a whole, please come along to our conference in November – details here.