Know thy impact
Leaders of evidence-based schools must still be sensitive to the biases they may encounter along the way, say John Hattie and Stephen Cox
Oliver Caviglioli noted that evidence-based school leadership is a ‘moral duty’. We have seen several school leaders destroy some of the best school programs because of their whims and beliefs, and seen others introduce high-impact programs despite resistance from many. He also considers many misconceptions, though we would add that ‘evidence-based’ should be the most hotly contested notion of all. There is evidence from research, including the interpretations teachers make from their experiences. Both need critique and verification, and both need to relate to the diagnosis, because solutions unrelated to problems rarely have impact. We need to be aware of our many biases, such as confirmation bias (always looking for the positive), and indeed it is optimal to ask, “What evidence would I accept that this program is not working, with whom, about what?”
What does growth look like?
Oliver is particularly concerned about the edu-myth of d=.40. This is no myth; it is the average of all 1,600 meta-analyses, that is all. I have used it as a hinge point to try to understand those effects above and below it.
When we also ask the average effect of moving from one year to the next (based on SATs, NCLB, NAPLAN, e-asTTle), it too is 0.40. But beware of the flaw of averages: the effect depends on how narrow or wide the outcome measure is (it is easier to get a higher effect in vocabulary than in creativity), the quality of the measure, the decisions you are making, and so on.
This is why we have emphasised ‘know thy impact’. The effects in Visible Learning are ‘probabilities’, and we want you to opt for interventions with a high probability of success. However, the fidelity, dosage, adaptation and quality of these interventions in your school are more important. Implemented poorly, they will reduce the impact.
We want you to build up local knowledge of the ‘average’ effects and study the variability of students around your mean, which is as important. We do argue that every student, no matter where they start, deserves at least a year’s growth for a year’s input, and we need to get better at understanding what this growth looks like. We recommend three measures: effect-sizes over time; artefacts of student work over time; and student voice about their progress.
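One way to make ‘effect-sizes over time’ concrete is to compute a growth effect size for a class: the mean gain across the year divided by the pooled standard deviation of the two waves of scores, then compared against the 0.40 hinge point described above. The sketch below uses invented scores purely for illustration; it is not an official Visible Learning tool, just the standard growth calculation under that assumption.

```python
# A minimal sketch of a year's-growth effect size for one class.
# Scores are invented for illustration only.
from statistics import mean, stdev

def growth_effect_size(pre, post):
    """Mean gain divided by the pooled SD of the pre and post scores."""
    pooled_sd = ((stdev(pre) ** 2 + stdev(post) ** 2) / 2) ** 0.5
    return (mean(post) - mean(pre)) / pooled_sd

pre  = [42, 55, 61, 48, 70, 53, 66, 59]   # hypothetical start-of-year scores
post = [50, 60, 68, 55, 78, 57, 74, 65]   # hypothetical end-of-year scores

d = growth_effect_size(pre, post)
print(f"effect size d = {d:.2f}")
print("at or above the 0.40 hinge" if d >= 0.40 else "below the 0.40 hinge")
```

As the article stresses, the class mean is only half the story: the spread of individual gains around that mean matters just as much.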
The average of .40 per year is no myth; however, it can be mythically interpreted. We share this concern with Oliver.
A tough ask
When we look at evidence bases, we are looking at the interaction of at least three things:
Rather than silver bullets, these should be viewed as a set of high-probability strategies that may work in your setting.
Internal evidence base
This is about establishing an objective base for understanding what already works in your context and diligently scaling it up.
Implementation evidence base
This means having a clear theory of action around improvement strategies, establishing a baseline and monitoring traction. The sink time (the time from inception to impact) is much longer for complex whole-school initiatives, which include building collective efficacy, than for micro-programmes.
The tough ask for school leaders is that all three need to be worked on simultaneously to avoid over-simplification. To help clarify this, the acronym DIIE has been introduced: Diagnose, Intervene, Implement and Evaluate. Not easy when the aeroplane is in mid-air, time is tight and strategic priorities constantly need to be rewoven into strategy (is that another change of education minister, inspection focus or examination standard I see on the horizon?).
Effect size was introduced into the educational lexicon as a means of comparison, not an absolute. There are many ways to measure effect size – as growth, as deviation from the norm, as difference from the control group. The Education Endowment Foundation has even combined them with cost to suggest value for money. They remain a not-so-new lens to help us become more evidence-based in our efforts to improve the impact of schools and teachers on the life chances of children.
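The point that effect size is a comparison, not an absolute, can be shown with a small sketch: the same invented intervention data yields a noticeably different d depending on whether you measure growth within the treated class or the difference from a control group at post-test. All scores below are hypothetical.

```python
# Two ways of measuring effect size on the same invented data,
# illustrating that d depends on the definition chosen.
from statistics import mean, stdev

treated_pre  = [50, 58, 63, 47, 55, 61]   # hypothetical scores
treated_post = [58, 66, 70, 54, 63, 68]
control_post = [53, 60, 64, 50, 57, 62]

def pooled_sd(a, b):
    return ((stdev(a) ** 2 + stdev(b) ** 2) / 2) ** 0.5

# 1) Effect size as growth within the treated class
d_growth = (mean(treated_post) - mean(treated_pre)) / pooled_sd(treated_pre, treated_post)

# 2) Effect size as difference from the control group at post-test
d_control = (mean(treated_post) - mean(control_post)) / pooled_sd(treated_post, control_post)

print(f"growth d = {d_growth:.2f}, vs-control d = {d_control:.2f}")
```

Neither number is more ‘true’ than the other; they answer different questions, which is why any d should be reported alongside how it was calculated.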
Case study 1 – Fielding Primary School
Our Visible Learning journey began in 2015. Regular monitoring of the quality of teaching and learning throughout the school told us that there were inconsistent teaching strategies and a lack of professional dialogue among staff and pupils.
We embarked upon a process of evidence gathering, where we sought to deeply understand our strengths and gaps.
Our key focus areas were to:
- Create a shared language of learning
- Develop a teaching structure
- Introduce ‘Impact Cycles’
- Produce shared understandings
- Increase professional dialogue and collaboration among teaching staff
Visible Learning is now firmly embedded at Fielding. Pupils talk openly and confidently about their learning, and are clear about what makes a good learner. The impact of our work was captured in our recent Ofsted inspection, where we were judged as Outstanding.
Case study 2 – Wolgarston High School
In 2014, although our school was improving its outcomes, it lacked a coherent approach to teaching and learning, and teaching in most classrooms was being driven, unwittingly, by our data targets. The school at that time could have been described as an exam factory.
The key ingredients of our approach were to ensure:
- Every lesson has clear learning intentions (know, understand, be able to) and process success criteria
- That feedback is for both the teacher and the pupil, and for closing the gap between where the pupil is and where they need to be in their learning
- That assessment be designed using SOLO, and have a balance of surface and deep questions
The evidence is clear that developing a Visible Learning approach has been successful. Results have continued to improve, and we have been in the top quintile of schools for progress nationally for the last three years in succession. Our journey is ongoing, because as we continue to evaluate our practice, we see ways to improve further.
About the authors
Professor John Hattie is director of the Melbourne Educational Research Institute at the University of Melbourne and the creator of Visible Learning – a synthesis of over 800 meta-studies covering more than 80 million students.
Stephen Cox is the CEO of Osiris International, a leading independent training provider for schools and colleges that regularly hosts a series of international conference events focusing on Visible Learning, Mindsets and other areas of innovation within education.
Find out more about Professor John Hattie’s work at the Visible Learning Leadership Conference events taking place in 2020 in Edinburgh (2nd March) and London (3rd March), or the upcoming Visible Learning World Conference taking place in London this November; further details can be found at osiriseducational.co.uk