Is technology really worth the investment?

The research questions

Setting aside the Hawthorne effect and the findings summarised by Hattie in Visible Learning - which essentially show that just about any intervention (even an unintended one) will have a (probably) positive impact, even if the effect size is small - let’s think about the publisher’s numbers for a moment…

What does an increase of 12-19 points “on average” actually mean? (from Raise-Online)

12 capped points:

  • One GCSE grade in two subjects for all pupils, OR
  • One GCSE grade in all subjects for a quarter of the pupils

24 capped points:

  • One GCSE grade in four subjects for all pupils, OR
  • One GCSE grade in all subjects for half of the pupils

So a gain of between 12 and 19 capped points could mean anything from “all the pupils doing better in at least two GCSEs” up to nearly “half the pupils doing better in all their GCSEs” - or, if you play with the maths (see the sketch below), a small number of pupils getting two GCSE grades higher in three subjects, or three GCSE grades higher in two subjects - or any combination in between.
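To make the “play with the maths” point concrete, here is a quick sketch in Python. It assumes the old Raise-Online convention that one GCSE grade boundary is worth 6 capped points and that the capped score counts a pupil’s best 8 subjects; the function and the scenario numbers are illustrative only.

```python
# A quick sketch of the arithmetic behind the "capped points" scenarios above.
# Assumption: one GCSE grade boundary = 6 capped points, and the capped score
# counts a pupil's best 8 subjects (a ceiling of 48 points per pupil).

POINTS_PER_GRADE = 6
CAPPED_SUBJECTS = 8

def average_gain(fraction_of_pupils, subjects_improved, grades_per_subject=1):
    """Cohort-average capped-point gain when a given fraction of pupils
    improves by some number of grades in some number of subjects."""
    per_pupil = subjects_improved * grades_per_subject * POINTS_PER_GRADE
    return fraction_of_pupils * per_pupil

print(average_gain(1.0, 2))                  # all pupils, one grade in two subjects -> 12.0
print(average_gain(0.25, CAPPED_SUBJECTS))   # a quarter of pupils, one grade in all -> 12.0
print(average_gain(1.0, 4))                  # all pupils, one grade in four subjects -> 24.0
print(average_gain(0.5, CAPPED_SUBJECTS))    # half of pupils, one grade in all      -> 24.0

# "Play with the maths": very different stories produce the same average.
print(average_gain(1/3, 3, grades_per_subject=2))  # a third of pupils, two grades
                                                   # higher in three subjects -> 12.0
```

The point being: a single cohort-average figure collapses many very different distributions of benefit into one number.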

What the data does not show is the impact of this £15,000 investment compared to other initiatives the school could implement (Hattie again, and the EEF) - nor does it show whether this investment works in the context of this school, at this time.

To be clear, I’m not knocking online e-learning tools - I’m sure they have a place when used in combination with other whole-school initiatives. What I want to focus on is the decision-making process and the research that went into a school spending a non-trivial amount on a whole-school initiative.

Ask for Evidence

My previous call for teachers to “ask for evidence” applies doubly here:

1. Ask the publisher for evidence (in this case, in fairness, they tried to supply it) - and make sure you, as educationalists, run that evidence through a ‘de-marketing-spin’ process and ask, “What is this evidence really showing me, and what is it not showing?”

2. You must find your own evidence. As has been shown, just about any intervention seems to work (at least in the short term); you must undertake your own trial before you spend the equivalent of a well-qualified LSA on an intervention strategy that might not actually work for you. In this case, the trial would be easy: benchmark a year group first, have half use the strategy for a term and half not, then benchmark them all again at the end. Add some real statistics (t-tests and effect sizes, not just means - see the sketch after this list) and you can make an informed judgement about whether the intervention works - or not.
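For illustration, here is a minimal sketch of that analysis in Python (with NumPy and SciPy). The benchmark scores are entirely hypothetical stand-ins for the two halves of the year group; substitute your own data.

```python
# A minimal sketch of analysing the half-and-half trial described above.
# The scores are hypothetical; plug in your own end-of-term benchmarks.
import numpy as np
from scipy import stats

# End-of-term benchmark scores for each half of the year group (made up).
intervention = np.array([54, 61, 58, 49, 63, 57, 60, 52, 66, 59])
control      = np.array([51, 55, 53, 48, 60, 54, 56, 50, 58, 52])

# Welch's t-test: compares the group means without assuming equal variances.
t_stat, p_value = stats.ttest_ind(intervention, control, equal_var=False)

# Cohen's d: effect size, using the pooled standard deviation.
pooled_sd = np.sqrt((intervention.var(ddof=1) + control.var(ddof=1)) / 2)
cohens_d = (intervention.mean() - control.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```

Hattie’s “hinge point” of d = 0.40 is one reasonable benchmark for deciding whether the measured effect is worth the money.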

And that’s my point really - schools need a staff member (or an external consultant, or to work as part of a collaboration) to inform decision-making on this scale. Someone who will constantly ask, “Where’s the evidence?”; a critical friend - a school governor, maybe?

Teachers make excellent researchers

I have blogged before about the role that classroom-based teachers have in (a) taking responsibility for being reflective researchers and (b) sharing their research with the widest possible audience - hence my Kickstarter to found a teacher-led, peer-reviewed research journal.

Find out more about The Journal of Applied Education Research, and get involved, here: http://jaer.org.uk/