The Case for Adaptive Quizzing in Assessment

At the beginning of the summer, I stumbled upon the work of Dr Svenja Heitmann. At the time, I wrote up the papers and began a dialogue with Dr Heitmann about her research. It follows on from my interest in Latimier et al. (2021) and Rawson's successive relearning.

“Without question, the most efficient schedule [for spaced learning] is an adaptive one, accounting for the learner’s rates of forgetting and prior knowledge”.

(Latimier et al., 2021: 980).

Dr Heitmann and her colleagues focused their attention on the adaptive (personalisation) mechanisms. Two adaptation models were investigated in the laboratory: performance-based and cognitive demand-based. Both models led to performance benefits. Personalising the adaptive quizzing using perceived cognitive demand-based adaptations "substantially increased the quizzing effect" (Heitmann et al., 2018: 10). This mechanism was then applied in the field study.

The field study results led Heitmann et al. (2021: 603) to conclude that the benefits of practice quizzing "in authentic learning contexts are even greater when the quiz questions are adapted to learners' state of knowledge".

What does that actually mean and why might it interest teachers?

The inference is that, in addition to knowing more, quizzing (retrieval practice) frees up cognitive capacity or thinking space. As Dr Heitmann commented:

The post-test performance was better because their mental resources weren’t as exhausted in the learning phase.

Dr. Svenja Heitmann

Students profited from the freed-up capacity for the execution of beneficial learning processes.

So, what can teachers take from Dr Heitmann’s research?

First, the benefits of quizzing over note-taking.

Second, the more a student knows going into an exam, the more cognitive capacity they will have to attend to the mechanics of that exam.

Third, perceived cognitive demand ratings (how difficult pupils find questions, rather than how many marks they were awarded) might be a more useful measure for personalisation, and for deciding whether a pupil should relearn or revise a topic area.

What might this look like in practice?

End-of-term assessments provide excellent opportunities to explore the ‘perceived cognitive demand’ of the questions you set. First, very high ratings for perceived cognitive load would indicate that learners have not yet acquired the knowledge necessary to master a question. They might benefit more from quiz questions of lower complexity. Similarly, very low cognitive load ratings would indicate that learners have already acquired the knowledge necessary to master a question and might benefit more from quiz questions of higher complexity.
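The adaptation rule described above can be sketched in a few lines of code. This is an illustrative sketch only: the 1–5 rating scale, the thresholds (ratings of 4 or above step difficulty down, 2 or below step it up), and the function name are assumptions for demonstration, not values taken from the Heitmann et al. studies.

```python
def next_question_level(current_level: int, demand_rating: int,
                        min_level: int = 1, max_level: int = 3) -> int:
    """Choose the complexity level of the next quiz question.

    demand_rating: learner's perceived cognitive demand for the last
    question, on an assumed 1 (very easy) to 5 (very demanding) scale.
    """
    if demand_rating >= 4:
        # Very high demand: the learner lacks the necessary knowledge,
        # so offer a less complex question next.
        return max(min_level, current_level - 1)
    if demand_rating <= 2:
        # Very low demand: the knowledge is secure, so offer a more
        # complex question next.
        return min(max_level, current_level + 1)
    # Moderate demand: stay at the current level.
    return current_level
```

For example, a pupil at level 2 who rates a question as very demanding (5) would drop to level 1, while one who rates it as very easy (1) would move up to level 3.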

However, wisely, Heitmann et al. (2022) advise caution with any metacognitive judgement, due to the common biases and heuristics of learners' self-assessment. A simple rating scale next to the question would suffice in this case.

I would also add that, in my view, any assessment with metacognitive judgements (be that confidence or perceived cognitive load), with feedback, promotes metacognitive accuracy. And metacognitive accuracy brings with it a crucial academic advantage. Now, with these two pieces of data (the rating and the question outcome), there is plenty to discuss with your pupils: first, the perceived cognitive load ratings themselves, and second, the difference between the rating and the performance.
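Comparing the rating with the question outcome can be done mechanically. The sketch below, a hypothetical helper (the function name, the 1–5 scale, and the disagreement thresholds are all my assumptions), flags the responses where a pupil's perceived demand and their actual result disagree, which are exactly the cases worth discussing:

```python
def calibration_flags(responses):
    """Flag mismatches between perceived demand and actual outcome.

    responses: list of (demand_rating, correct) pairs, where
    demand_rating is on an assumed 1 (very easy) to 5 (very demanding)
    scale and correct is True/False for the question outcome.
    Returns (index, description) pairs for each mismatch.
    """
    flags = []
    for i, (rating, correct) in enumerate(responses):
        if correct and rating >= 4:
            # Underconfident: succeeded despite finding it very demanding.
            flags.append((i, "found it hard but got it right"))
        elif not correct and rating <= 2:
            # Overconfident: failed despite finding it very easy.
            flags.append((i, "found it easy but got it wrong"))
    return flags
```

A pupil with few flags is well calibrated; a cluster of "easy but wrong" flags points to overconfidence on that topic.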

Do We Have to Use Software for Adaptive Quizzing?

There are plenty of digital platforms on the market using phrases like "AI", "adaptive", and "personalisation". However, adaptive quizzing does not require software. As Dr Heitmann explained:

Adaptive quizzing could just as easily be done with different folders containing differently difficult questions… and then the students use some kind of rating scale (maybe even smiley faces for the young students) to then choose the folder their next question would be coming from. There is still a whole lot of paper pencil schooling going on out there – and adaptive quizzing is available there too.

Dr. Svenja Heitmann

I asked Dr Heitmann for her professional thoughts on the benefits of requiring learners to make the perceived cognitive demand rating itself. Are the performance gains part memorial and part metacognitive?

It’s not all about reflecting, I think that’s more of a nice side effect (to strengthen metacognition as you wrote). The adaptation is more focused on providing students with fitting questions when you do not have the resources to sit down with every single student to adapt the difficulty yourself according to your personal assessment that’s based on your interaction with that student… because no teacher teaching in regular schools has those resources!

Dr. Svenja Heitmann

Why Has Adaptive Quizzing Stayed in the Shadows?

Dr Heitmann argues,

What’s been missing is informed teachers in classrooms, teaching with adaptive quizzing and teachers with a broader audience who can make adaptive quizzing better known. Teachers need to know that it’s a good idea to adapt questions.

Dr. Svenja Heitmann

My thanks to Dr Svenja Heitmann

Heitmann, S., Grund, A., Berthold, K., Fries, S., & Roelle, J. (2018). Testing is more desirable when it is adaptive and still desirable when compared to note-taking. Frontiers in Psychology, 9, Article 2596.
Heitmann, S., Obergassel, N., Fries, S., Grund, A., Berthold, K., & Roelle, J. (2021). Adaptive practice quizzing in a university lecture: A pre-registered field experiment. Journal of Applied Research in Memory and Cognition, 10(4), 603–620.
Heitmann, S., Grund, A., Fries, S., Berthold, K., & Roelle, J. (2022). The quizzing effect depends on hope of success and can be optimized by cognitive load-based adaptation. Learning and Instruction, 77, 101526.
Latimier, A., Peyre, H., & Ramus, F. (2021). A meta-analytic review of the benefit of spacing out retrieval practice episodes on retention. Educational Psychology Review, 33, 959–987.
