Looking to medicine for the cure for our ailing assessment system

What we can learn from multi-dimensional assessment in Higher Education

The deficiencies of our current education assessment system, shown clearly in the way that performance is measured in the majority of GCSE and A level subjects, have been well documented:

  • A primary focus on the acquisition, memorisation and recall of information, rather than on the deep understanding and application of knowledge
  • Grading by means of norm referencing, establishing a failure rate of roughly a third of all candidates at the outset of their course
  • A concentration on traditional specialist academic procedures that, by implication, devalues the many other skills and competences that children need to develop – both for employment and to enjoy full and rewarding lives
  • High-stress, high-stakes testing, where final grades are determined by performance in a few exams on as few as two days, rather than by a cumulative pattern of performance over time
  • A focus solely on the individual’s solo achievement rather than their accomplishment as part of a group and their ability to interact productively in a team
  • A reliance on assessment by a computer or an examiner unknown to the candidate, rather than by a tutor who knows the student’s strengths, weaknesses and holistic performance, or by a professional or employer
  • Assessment of learning in abstract (in the exam hall) rather than application of learning in context (in the field, through interaction with people)
  • A paucity of candidate choice regarding elements of the syllabus to be studied: whilst the exam boards may provide some degree of optionality to centres, it’s rare for these options to be passed on to students.

Yet when students leave school and head into higher education, they often embark on courses which have a far more nuanced and holistic approach to evidencing progress and accomplishments.

My niece recently graduated as a young doctor from the University of Exeter Medical School, after a five-year degree course. Her cohort were assessed in four main domains of skill, knowledge and character, using a diverse and imaginative range of methods.

One. The Exeter medics are firstly assessed for medical knowledge, primarily through a series of Applied Medical Knowledge (AMK) tests. These are online exams lasting 3 hours and 15 minutes each; intelligently, they include a built-in 15 minutes of rest time. The AMKs, as the name implies, are designed to test applied medical knowledge rather than simple factual recall, presenting clinical vignettes for the student to consider. So the student is being asked to demonstrate knowledge in context rather than give facts for facts’ sake. The knowledge level is set at that required of a newly qualified doctor.

Importantly, the AMKs are a longitudinal test of the growth of the student’s medical knowledge across the whole medical programme. Medics sit the test four times a year in Years 2-5 (with two formative practice tests in Year 1). Because the AMK is a cumulative assessment, grades on individual tests matter less than the overall pattern of attainment, allowing for an off day and recognising the importance of assessing holistic performance. Grades on individual tests are aggregated into a current cumulative grade. In the early years the grades are norm referenced (which allows any struggling students to be identified and supported), but by Year 5 criterion referencing is used to determine final grades, allowing everyone to pass should they meet the required standard.

Two. The second area of assessment relates to clinical practice. This involves students being assessed, in multiple ways, on their performance in clinical scenarios. One method is Objective Structured Clinical Examinations (OSCEs). OSCEs involve the student visiting and performing at practical ‘stations’ set up to simulate clinical interactions. Student performance is observed and assessed at each station by trained assessors who may be course tutors and are often clinical professionals and doctors. Students in Years 1 and 2 undertake an OSCE each term, with a fourth optional OSCE offered towards the end of the year for any student requiring it to make up for absence or poor performance earlier in the year.

Other activities to evidence clinical practice are conducted ‘on the job’ by the doctors supervising the student medics during placements. They capture various perspectives on students’ interactions with patients and comprise Mini Clinical Evaluation Exercises (assessments of the clinical and professional interaction observed between a student and a patient on a ward, in a clinic or in surgery), Case-Based Discussions (at which a student presents verbally on a patient case or condition seen that week in the clinical environment), Structured Clinical Sessions (which demonstrate each student’s ability to clerk, present and lead a patient-based discussion) and Direct Observation of Practical Skills (involving the observation and assessment of the student’s performance of practical procedures in the clinical environment with real patients).

Three. The third domain is Special Study Units (SSUs). This is where a degree of optionality is introduced, allowing each medic to pursue their particular interests and play to their strengths. In each of Years 1-4, students pick from a range of modules in each of four areas (Biomedical Sciences, Healthcare, Global and Planetary Health, and Healthcare in Practice) and explore a speciality or subject, guided by a tutor and working in a group with 3-5 other medics. Findings can be presented in essay, oral or poster format. SSUs are assessed by the tutors who teach the 3-week modules; they evaluate the quality of the student’s engagement with the project and their group members as well as the product of the SSU.

Four. Perhaps most interesting of all, the fourth domain relates to professional development. Each time the student comes into contact with a member of the faculty or the profession, whether in a small-group scenario or on a placement, as part of their longitudinal teaching, in clinical practice, or in an OSCE, their behaviour and professionalism are assessed. Every interaction is observed and a judgement about professionalism made.

Five. Finally, drawing the four domains together, each medic has an eportfolio, in which all of their test scores, clinical practice results, SSU grades and professionalism judgements over the duration of the degree course are recorded. Each student makes a personal contribution to the eportfolio in the form of regular written reflections on their practice (perhaps a case they have seen that was particularly distressing, or an ethically challenging scenario they encountered), so demonstrating their ‘softer’ skills of reasoning, reflecting and learning from mistakes.

It fills me with confidence and optimism that, in these challenging times for the NHS, young doctors like Molly are graduating from such forward-thinking medical schools, where their knowledge, skills and character have been fostered, evidenced and celebrated, and where they have been:

  • invited to articulate as well as write about their learning
  • assessed practically as well as academically
  • required to demonstrate application of knowledge
  • assessed on their ability to work productively in a team
  • assessed on performance over time rather than in snapshots of achievement
  • protected from the unnecessary stress of high stakes testing in what is a very pressured field of work
  • assessed in the environment where they will be applying their knowledge in the future
  • supported to achieve, rather than tripped up and sorted into successes and failures
  • provided with an appropriate degree of choice and autonomy about aspects of specialism and ways of evidencing strengths.

It was thanks to this cumulative and longitudinal assessment system that student medics like Molly were able to graduate a few months early in the spring of 2020 to meet the acute demand for doctors presented by the coronavirus pandemic. Compare that with the confusion and uncertainty caused by the decision to abandon terminal exams in schools in 2020 and 2021 without a robust means of evidencing learners’ achievements and competencies.

How much more intelligent would it be to have an assessment system like this in schools, whereby each learner has an eportfolio or scorecard on which to record their multiple strengths and achievements over time, as evidenced by anonymous examiners, personal tutors, professionals in the field and themselves? If it can be achieved in higher education, it can be achieved in our schools too.
