When research confirms our existing beliefs, people often say “I knew that”, but there is obviously a huge difference between what we believe and what we know. A good example here is the idea that if juvenile offenders meet with convicted felons, they will be “scared straight”. A careful evaluation of such programmes, conducted by Petrosino and colleagues in 2004, found that this practice actually increased criminal behaviour in young people. The job of research is to separate the things that are obvious and true from those that are obvious and false.

Although it seems counterintuitive, testing students on what they have learned increases their long-term recall of the material. Moreover, most studies find that practice testing has a greater impact on long-term learning than spending the same amount of time restudying the material. What is even more surprising is that this happens even if the test is not scored: the simple act of retrieving something from memory makes the memory stronger.

Of course, if the test is scored and students get feedback on their answers, there is an additional potential benefit, especially if students discover that answers they thought were correct are actually wrong.

When researchers began to investigate the potential of learning from tests, one issue that came up was whether the level of confidence a student had in an answer would affect the benefit of being corrected. Intuitively, students who were very confident in their answer might resist being corrected. However, when Brady Butterfield and Janet Metcalfe investigated this in 2001, they found the opposite. The more confident students were that their answers were correct, the bigger the impact of correction on long-term learning.

The question, then, is whether this effect would also apply in typical classrooms. A very recent paper by Colin Foster, Simon Woodhead, Craig Barton, and Alison Clark-Wilson has shown that it does.

This group of investigators took advantage of a large database of responses from students using the EEDI online platform (www.eedi.co.uk) and looked specifically at multiple-choice questions where students had rated, on a five-point scale, how confident they were that their answer was correct (actually they used five emojis: 😬, 😦, 😐, 🙂 and 😀).

Some predictable findings emerged. Students with higher achievement levels were more confident, as were those from more advantaged homes. Boys were, on average, more confident than girls, and younger students were more confident than older ones.

In order to investigate the hypercorrection effect, the researchers analysed the results of 3838 students who had answered similar questions in the first and second tests, and who had provided a confidence rating for the question in the first test. These students gave a total of 44 524 incorrect answers in the first test. Of these incorrect answers, 19 885 were followed by a correct response to the matching question in the second test, and 24 639 by an incorrect response. It could certainly be that students who expressed high confidence in the first test tended to be high-achieving students, so that even without a hypercorrection effect they could be expected to do better on the second test: a poor first-test result could simply be bad luck, and they would be unlikely to be so unlucky a second time (what statisticians call “regression to the mean”).
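To see how regression to the mean could produce an apparent improvement all by itself, here is a small simulation sketch in Python. The numbers are invented purely for illustration and have nothing to do with the study’s data: students whose first-test scores were unusually low tend, by chance alone, to score closer to their true level the second time.

```python
# Illustration of regression to the mean with invented numbers (not study data):
# students who scored unusually badly on test 1 improve on test 2 even though
# nothing about their underlying ability has changed.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

ability = rng.normal(50, 10, n)          # each student's "true" level
test1 = ability + rng.normal(0, 5, n)    # test 1 score = ability + luck
test2 = ability + rng.normal(0, 5, n)    # test 2 score = ability + fresh luck

# Students in the bottom quarter on test 1 ...
unlucky = test1 < np.quantile(test1, 0.25)

# ... score noticeably higher on test 2, on average, with no intervention at all.
print("Test 1 mean (bottom quarter on test 1):", round(test1[unlucky].mean(), 1))
print("Test 2 mean (same students):           ", round(test2[unlucky].mean(), 1))
```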

To overcome this, the authors used statistical methods to control for the students’ age, gender, socioeconomic status, and overall achievement level, as measured by their score on the first test. They found that, for students who answered a question incorrectly on the first test, an increase of one step on the five-point confidence scale (e.g., from 2 to 3) was associated with a 7% greater probability of getting the parallel item on the second test correct.
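To give a sense of what “controlling for” these variables involves, here is a minimal sketch in Python using a logistic regression on synthetic data. The variable names, the model, and the numbers are all illustrative assumptions of mine; they are not the actual analysis or data reported by Foster and colleagues.

```python
# Illustrative sketch only: a logistic regression on synthetic data, assuming
# (hypothetically) that the analysis controls for age, gender, socioeconomic
# status, and first-test score. Not the authors' actual model or data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000  # synthetic "incorrect first-test answers"

df = pd.DataFrame({
    "confidence": rng.integers(1, 6, n),    # 1-5 confidence rating on the first test
    "age": rng.integers(11, 17, n),         # student age in years
    "girl": rng.integers(0, 2, n),          # 1 = girl, 0 = boy
    "ses": rng.normal(0, 1, n),             # socioeconomic status index
    "first_score": rng.normal(0, 1, n),     # standardised first-test score
})

# Build a synthetic outcome in which higher confidence in the original (wrong)
# answer raises the chance of getting the parallel second-test item right.
linpred = -1.0 + 0.3 * df["confidence"] + 0.4 * df["first_score"] + 0.1 * df["ses"]
df["correct_second"] = (rng.random(n) < 1 / (1 + np.exp(-linpred))).astype(int)

# Logistic regression: the confidence effect is estimated with the other
# variables (age, gender, SES, first-test score) held fixed.
model = smf.logit(
    "correct_second ~ confidence + age + girl + ses + first_score", data=df
).fit(disp=False)

print(model.summary())
print("Odds ratio per confidence step:", round(np.exp(model.params["confidence"]), 2))
```

The point of a model like this is that the coefficient on confidence is estimated while the other variables are held fixed, so any association with second-test success is not simply a by-product of more confident students being older, more advantaged, or higher-achieving.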

Certainly, because this was not a randomised controlled trial, we cannot be sure that the higher confidence caused the bigger improvement from the first to the second test. However, given that the researchers controlled for age, gender, socioeconomic status, and prior achievement, and considering that the students were drawn from many different schools, the idea that high-confidence errors are indeed hypercorrected, as Butterfield and Metcalfe suggested, does seem the most likely interpretation of these results.

What does this mean for the classroom? Perhaps the most important implication of this study, and of others that have examined the hypercorrection effect, is that asking students to review their scored tests and to look again at the questions they answered incorrectly does have a benefit. Combined with the strong evidence base on the practice testing effect, it seems that regular practice testing is an important aspect of effective teaching, but one that is ignored in many countries, schools, and classrooms.

One reason for this, of course, is that students do not like tests. However, the benefit of practice testing to students is so great that I think we must find ways to rehabilitate testing. One way to do this is what I call “zero-stakes testing”: students complete a test on their own, and when they have finished, they receive the correct answers and score their own test. They do not have to tell the teacher how they did unless they want to. All in all, practice testing is such a powerful way of consolidating learning that we cannot allow students’ dislike of tests to make our teaching less effective than it might be.

References

Butterfield, B., & Metcalfe, J. (2001). Errors committed with high confidence are hypercorrected. Journal of Experimental Psychology: Learning, Memory, and Cognition, 27(6), 1491-1494.

Foster, C., Woodhead, S., Barton, C., & Clark-Wilson, A. (2022). School students' confidence when answering diagnostic questions online. Educational Studies in Mathematics. https://doi.org/10.1007/s10649-021-10084-7

Petrosino, A., Turpin-Petrosino, C., & Buehler, J. (2004). 'Scared Straight' and other juvenile awareness programs for preventing juvenile delinquency. Oslo, Norway: Campbell Collaboration.

March 16th, 2022 | Author: Dylan Wiliam