The Debunking Handbook Part 3: The Overkill Backfire Effect

The Debunking Handbook is an upcoming guide to debunking myths, by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, unfortunately there is no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of misinformation. This Handbook boils down the research into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation. The Handbook will be available as a free, downloadable PDF at the end of this 6-part blog series.

This post has been cross-posted at Skeptical Science

One principle that science communicators often fail to follow is making their content easy to process. That means easy to read, easy to understand and succinct. Information that is easy to process is more likely to be accepted as true.1 Merely enhancing the colour contrast of a printed font so it is easier to read, for example, can increase people’s acceptance of the truth of a statement.2

Common wisdom is that the more counter-arguments you provide, the more successful you’ll be in debunking a myth. It turns out that the opposite can be true. When it comes to refuting misinformation, less can be more. A debunking that offered three arguments, for example, was more successful in reducing the influence of misinformation than one that offered twelve arguments, which ended up reinforcing the myth.1

The Overkill Backfire Effect occurs because processing many arguments takes more effort than just considering a few. A simple myth is more cognitively attractive than an over-complicated correction.

The solution is to keep your content lean, mean and easy to read. Making your content easy to process means using every tool available. Use simple language, short sentences, subheadings and paragraphs. Avoid dramatic language and derogatory comments that alienate people. Stick to the facts. 

End on a strong and simple message that people will remember and tweet to their friends, such as “97 out of 100 climate scientists agree that humans are causing global warming”; or “Study shows that MMR vaccines are safe.” Use graphics wherever possible to illustrate your points.

Scientists have long followed the principles of the Information Deficit Model, which suggests that people hold erroneous views because they don’t have all the information. But too much information can backfire. Adhere instead to the KISS principle: Keep It Simple, Stupid!


  1. Schwarz, N., Sanna, L., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Advances in Experimental Social Psychology, 39, 127-161.
  2. Reber, R., & Schwarz, N. (1999). Effects of perceptual fluency on judgments of truth. Consciousness and Cognition, 8, 338-342.
