Ask the expert – dealing with unexpected exam results

Q For schools that have been experiencing gradual improvements in their results, a sudden drop in grades can come as a shock. It’s not always immediately obvious what has caused it – and references to ‘new specifications’ or ‘changes in grading scales’ aren’t terribly helpful for staff, students or parents. Do you have any advice that would be useful for a school in this position, both in terms of supporting learners through resits, and ensuring that the next cohort to face GCSEs is genuinely exam-ready?

A Before schools focus on how well prepared they were for the new exams, it’s worth checking what else has been going on: has there been turnover of staff in key departments, or has the cohort changed considerably from last year?

You can look at what went wrong (and right!) using your exam board’s results analysis tool. This will show you where your students performed well or poorly compared with other schools, so you can identify areas for improvement. Request the scripts of students who did well in tricky areas so you can see what they did. Online and face-to-face feedback events run by exam boards are another way of exploring what students found difficult, and of unpicking exemplar student answers.

As well as looking at your coverage of the specification content, look again at the assessment objectives and how they’re reflected in questions and mark schemes. The new GCSEs place much less emphasis on knowledge recall; it’s all about applying knowledge in unfamiliar contexts, so revisit what these questions require and use your exam board’s resources and exemplar answers to ensure you’re teaching your students the skills they need.

Q A set of unexpectedly strong results, whilst undoubtedly news to be celebrated, can lead to a period of uncertainty for staff and SLTs. Suddenly, the bar has been raised, and it may not be easy to understand exactly how that has happened. For example, were teachers’ estimates of students’ capabilities simply not high enough? And if so, what could have been done to ensure a more accurate picture? Is it possible to identify exactly what worked for these young people, in order to replicate it in future years? What will happen if the next cohort fails to replicate this level of success? What strategies can you suggest to help schools ensure continued success in these circumstances?

A Predicting outcomes is always tricky, especially in the first year of new specifications with a new grading scale. As with schools that didn’t do as well as they’d hoped, it’s worth checking that your cohort is comparable to last year’s.

Use your exam board’s results analysis tool (and, when you get it, your DfE data through Analyse School Performance, the successor to RAISEonline) to compare your results with those of similar schools. If you genuinely have performed well, you may have prepared better than you realised for the new exams.

If that’s the case, congratulations! Don’t rush to change a winning formula. Use the results analysis tool to see where you did particularly well and if there are any areas for improvement. Request scripts for high-performing students so you can see what worked.

To improve your predictions for future years, compare your cohort’s performance to your forecasts. Did you underestimate across the board, or did certain students perform better than expected? Also make sure you’re using your exam board’s Year 10 mock tests, analysers and other resources to help inform your predictions, as well as your teaching.

Dale Bassett is Head of Curriculum Strategy at AQA. He leads the curriculum team, which is responsible for devising and delivering the curriculum strategy for each subject and providing expert subject input to AQA’s work.