We need a more flexible and useful assessment regime that will motivate young people

Standards beyond a syllabus, assessment beyond marking, reporting beyond grades

Amelia Peterson Social Policy Fellow at the London School of Economics

In my last blog, I described why we had to change the underlying assumptions of the exam system: for the most part, it's not a competition. The broader purpose of exams is to incentivise learning. Exams motivate students to study. As Laura McInerney has argued on Twitter, they are like the sports match that motivates you to practice.

The problem is, we all know what happens with sport. Young people who show early promise are picked up, coached, trained and – indeed – motivated by the joy of competing and the chance of winning. Others are not, and many stop doing any kind of sport as soon as they can.

We can’t let our assessments be like that. They need to provide motivation in another way.

We know a lot about what motivates people: things like agency (the ability to make choices); relationships (working with and for others); and competence (understanding what you can do and how you can learn more). A proven means of motivating adolescents to learn is through public exhibitions; so is being able to explore and project aspects of identity.

All of these things are very hard to do within the confines of the GCSE years. Good pedagogy and the exam system seem to be at odds. But here are three changes that could bring them closer together.

Standards beyond a syllabus

International onlookers are often surprised to discover that England has no national curriculum beyond the age of 14. After this, it is replaced by a set of exam specifications. Those specifications then get turned into a syllabus for each subject qualification from each exam board.

If you look through the GCSE subject content held by the Department for Education, most of it reads really well. History, for example, despite controversy about too much of this or too little of that, has as its first learning outcome that it should enable understanding of "the wide diversity of human experience", and specifies only some breadth of time coverage and that 40% of assessed content should be British history.

Sounds great, sign me up!

But from this subject content we get a specification created by each of the exam boards. Suddenly the scope is narrowed. So with AQA, for example, roughly a quarter of your teaching would have to be on either Norman, Medieval, Elizabethan or Restoration England. You could also choose the option of 'migration, empires and the people', but then you're covering everything from 790 onwards. If you wanted to study in depth, for example, the Enlightenment; women's liberation; the 19th-century British empire; or any other topic that might get teenagers interested in history and indeed help them understand "the wide diversity of human experience", that would not be possible. By the time students get a say, they are faced just with "the syllabus": a tiny handful of historical periods and places.

History is an easy subject to pick over, but this kind of narrowing is everywhere. Where there is breadth – as in Maths – it comes through a list of disconnected learning objectives: a very long syllabus, but a poor relation to a standard of what someone can do or understand.

A starting point for better assessment would be to replace the syllabus with a short, essential set of standards to be used when assessing work. The job of exam boards should be to help carry out examining, not to determine the de facto curriculum.

Assessment beyond marking

In a quest for objectivity, our approach to examining in this country has been reduced to marking. Teachers are paid (too little) in the summer to “apply” a mark scheme. The idea is that the mark scheme is simple enough that there is little room for doubt. It is also so narrow and predictable that it rewards teaching to it.

Predictability is a big problem because it undermines the basic logic of assessment theory. Exam-style questions are meant to sample what you know and can do in order to give a picture of the whole. But if you are only learning how to answer exactly the type of questions that you know will be asked, your results give a distorted picture of your knowledge. (This might be why we've seen such divergence over the years between exam results and England's performance on international assessments.)

Whatever replaces current GCSEs, it needs to draw on better assessment theory. Where we want to assess knowledge through batches of questions, those questions should be randomly selected from a wide range of question types. And if we want to assess skills or applied knowledge through written work, products or performances, we should look to methods like comparative judgement, or rubrics and moderation. Assessments should evaluate actual productions, knowledge or skills, not the ability to answer a set question format.

Reporting beyond grades

The final way we can move beyond current GCSEs is to think seriously about what needs to be communicated to external stakeholders – whether at age 16 or later. The shift from A*–G grades to 9–1 has already put paid to the idea that employers can't cope with change (and you couldn't come up with a much more confusing change than reversing the order of O Level grades).

Again, the design of GCSE grading is farcical because it is filtered through the exam boards.

The Department for Education subject guidance provides assessment objectives; exam boards then divvy these up amongst papers and questions; and finally a student gets a transcript that gives their results, usually on each of two papers, but no information on which of the objectives they have actually met or to what extent.

The whole process of reporting therefore involves a massive loss of information about what students can actually do. All students are left with is a grade – effectively, a rank.

Any alternative to GCSEs should start with the question of what a transcript needs to communicate, and to whom. What are the key things that post-16 settings or employers try to infer from GCSE grades and how could they be communicated more accurately? Answering this question would provide a basis for a much more flexible and useful qualification structure – with more chance to reach the ultimate goal of motivating all young people to learn.

Enjoy getting into the weeds? Look out for opportunities to join groups working on assessment design.
