Category Archives: Psychology

Restoring Recurrent Fury

A peer-reviewed article appeared in print today in an open-access journal that is likely to stimulate some interest and debate:

Lewandowsky, S., Cook, J., Oberauer, K., Brophy, S., Lloyd, E. A., & Marriott, M. (2015). Recurrent fury: Conspiratorial discourse in the blogosphere triggered by research on the role of conspiracist ideation in climate denial. Journal of Social and Political Psychology, 3(1). doi:10.5964/jspp.v3i1.443.

The article fits within a fairly large and growing body of evidence suggesting that the rejection of well-established scientific facts, such as the safety and efficacy of vaccinations, the fact that HIV causes AIDS, or the fact that CO2 emissions alter our global climate, is often accompanied by conspiracist ideation—that is, the idea that scientists or the government are colluding to create a “hoax”. The “hoax” may involve the link between HIV and AIDS, between smoking and lung cancer, or between CO2 emissions and climate change: denial of scientific propositions involves the same playbook and the same motivated cognition, irrespective of which scientific fact is being targeted.

Our new article reports three studies that examined the discourse in the climate-“skeptic” blogosphere in response to an earlier publication in Psychological Science by Lewandowsky, Oberauer, and Gignac (often known as LOG12), which reported a small but significant (and replicable) association between the endorsement of various conspiracy theories and the rejection of climate science.

Continue reading Restoring Recurrent Fury

Recursive Fury: Facts and misrepresentations

Our paper “Recursive fury: Conspiracist ideation in the blogosphere in response to research on conspiracist ideation” has been published. The paper analyzed the public discourse in response to an earlier article by Lewandowsky, Oberauer, and Gignac (LOG12 for short from here on), which led to some discussion on this blog earlier.

Refreshingly, the journal Frontiers makes all papers available for free with no paywall. Another unique feature of this journal is that readers can post comments directly beneath the abstract. Unfortunately this has led to the posting of a number of misrepresentations of the paper.

In this post, I’ll be addressing some of these misconceptions (but being careful to practise what I preach, will adopt the principles of the Debunking Handbook when I debunk the misconceptions). So here are some key facts about the Recursive Fury paper:

Conspiracy theorists are those who display the characteristics of conspiracist ideation

Yep, just stating the obvious, right? Recursive Fury establishes, from the peer-reviewed literature, the traits of conspiracist ideation, which is the technical term for a cognitive style commonly known as “conspiratorial thinking”. Our paper featured six criteria for conspiratorial thinking (a small illustrative sketch follows the list):

  1. Nefarious Intent: Assuming that the presumed conspirators have nefarious intentions. For example, if person X assumes that blogger Y colluded with the New York Times to publish a paper damaging to X, then X presumes nefarious intent on the part of Y.
  2. Persecuted Victim: Self-identifying as the victim of an organised persecution.
  3. Nihilistic Skepticism: Refusing to believe anything that doesn’t fit into the conspiracy theory. Note that “conspiracy theory” here is a fairly broad term and need not involve a global conspiracy (e.g., that NASA faked the moon landing) but can refer to small-scale events and hypotheses.
  4. Nothing occurs by Accident: Weaving any small random event into the conspiracy narrative.
  5. Something Must be Wrong: Switching liberally between different, even contradictory conspiracy theories that have in common only the presumption that there is something wrong in the official account by the alleged conspirators. Thus, people may simultaneously believe that Princess Diana faked her own death and that she was assassinated by MI5.
  6. Self-Sealing reasoning: Interpreting any evidence against the conspiracy as evidence for the conspiracy. For example, when climate scientists are exonerated of any wrong-doing 9 times over by different investigations, this is reinterpreted to imply that the climate-change conspiracy involves not just the world’s climate scientists but also the investigating bodies and associated governments.
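To make this concrete, below is a minimal sketch, in Python, of how excerpts of discourse might be hand-coded against the six criteria and then tallied. It is purely illustrative: the short codes, the helper function, and the example data are invented for this post and are not the coding procedure used in the paper.

    # Illustrative only: a toy tally of hand-assigned criterion codes.
    # The six criteria follow the list above; everything else is made up.
    CRITERIA = {
        "NI": "Nefarious Intent",
        "PV": "Persecuted Victim",
        "NS": "Nihilistic Skepticism",
        "NoA": "Nothing occurs by Accident",
        "SMW": "Something Must be Wrong",
        "SS": "Self-Sealing reasoning",
    }

    def tally(coded_excerpts):
        """Count how often a human coder assigned each criterion.

        `coded_excerpts` maps an excerpt ID to the set of criterion
        codes a rater assigned to that excerpt.
        """
        counts = {code: 0 for code in CRITERIA}
        for codes in coded_excerpts.values():
            for code in codes:
                counts[code] += 1
        return counts

    # Hypothetical usage: two excerpts, coded by hand.
    example = {"excerpt-01": {"NI", "PV"}, "excerpt-02": {"SS"}}
    print(tally(example))  # {'NI': 1, 'PV': 1, 'NS': 0, 'NoA': 0, 'SMW': 0, 'SS': 1}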

Continue reading Recursive Fury: Facts and misrepresentations

ABC documentary demonstrates the how and why of climate denial

This is a repost from Skeptical Science. Note that STW’s own Stephan Lewandowsky has also published on this topic in The Age/Brisbane Times and Australian Media Centre/ABC Environment.

Tonight, the Australian TV channel ABC will air the documentary I Can Change Your Mind about Climate. The show features climate activist Anna Rose and retired Liberal senator Nick Minchin attempting to change each other’s minds about climate change by introducing each other to a number of leading voices on the issue. Some have argued, with a fair amount of justification, that it’s unwise to give the small minority of those who reject climate science an equal voice with the overwhelming majority of climate scientists who agree that humans are causing global warming. However, in an article published today at ABC Environment, I argue (in a valiant effort to take a glass-half-full approach) that this documentary can instruct us on the how and why of climate denial. Here’s an excerpt:

What do you do if all the world’s experts disagree with you? A decades-old technique perfected by the tobacco industry is to manufacture the appearance of a continued debate through fake experts. Climate change is a complicated, multi-disciplinary science, and yet many of the leading voices who purport to know better than the experts have never published a single piece of climate research. Of Nick’s four voices against the scientific consensus, we have a blogger, an engineer and a political lobbyist. Nick turns to only one actual climate scientist, whose research has been thoroughly refuted in the peer-reviewed literature.

Alternatively, when the world’s leading experts disagree with you, another popular approach is to don a tinfoil hat and cry conspiracy. A small minority seem to believe that tens of thousands of climate scientists across the globe are all engaged in a conspiracy. Of course there is no evidence for this (which to the paranoid is further proof of a conspiracy), even when scientists’ emails are stolen and pored over with a fine-tooth comb. Fortunately very little tinfoil-hattery is in evidence throughout the documentary, apart from a throwaway line from David Evans that scientists are ‘concealing the evidence’.

How does one deny the consensus of evidence? One straightforward approach is to simply ignore it! Jo Nova ignores satellite observations that directly measure an increased greenhouse effect when she claims the warming effect from carbon dioxide (CO2) is immeasurable. Richard Lindzen claims negative feedbacks will cancel out CO2 warming, citing the Earth’s past. But it’s precisely the Earth’s past that provides many independent lines of evidence for reinforcing feedbacks that are an integral part of our climate system.

Marc Morano delivers a breathtaking torrent of misinformation (although I’m not sure he did take a breath) that ignores entire swathes of evidence. He overlooks the fact that Arctic sea ice has dramatically thinned with the total amount of ice hitting record low levels in 2011. He ignores that global warming is still happening, with our planet currently absorbing heat at a rate of two Hiroshima bombs per second. Genuine scepticism requires considering the full body of evidence in order to properly understand what’s happening to our climate. What we witness from Nick Minchin’s witnesses is not genuine scepticism but rejection of any inconvenient evidence.

Read full article…
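As an aside, the “two Hiroshima bombs per second” figure quoted above is just a unit conversion on the rate at which the planet is accumulating heat. Here is a back-of-the-envelope sketch in Python; the energy-imbalance and bomb-yield values are round-number assumptions for illustration, and published estimates of the imbalance vary.

    # Rough check of the "two Hiroshima bombs per second" figure.
    # Both input values are assumptions for illustration.
    import math

    EARTH_RADIUS_M = 6.371e6                          # mean radius of Earth
    EARTH_AREA_M2 = 4 * math.pi * EARTH_RADIUS_M**2   # ~5.1e14 m^2
    ENERGY_IMBALANCE_W_M2 = 0.25                      # assumed net heat uptake (W/m^2)
    HIROSHIMA_J = 6.3e13                              # ~15 kilotons of TNT, in joules

    heating_rate_w = ENERGY_IMBALANCE_W_M2 * EARTH_AREA_M2   # joules per second
    print(f"{heating_rate_w / HIROSHIMA_J:.1f} Hiroshima bombs per second")  # ~2.0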

Observing the misinformation of David Evans, Jo Nova, Marc Morano and Richard Lindzen is an examination into the how of climate denial, exposing the techniques common to all movements that deny a scientific consensus. To explain the why, I leave it to Naomi Oreskes, who deconstructs Nick Minchin’s rejection of climate science in some powerful footage that tragically didn’t make it into the final cut. Thanks to the magic of the interweb, here it is in all its YouTube glory (many thanks to the producers for granting permission for me to upload the video):

The documentary airs on Thursday at 8.30pm Australian EST. During the documentary and the following Q&A panel (featuring Anna and Nick), there’ll be a live blog featuring scientists across Australia commenting on the documentary in real time. A number of scientists will also be tweeting at the same time with the hashtag #qandascientists (I’ll be tweeting away from @skepticscience and anticipate posting short URLs to SkS rebuttals as the inevitable climate myths appear).

Debunking Handbook: update and feedback

When we published the Debunking Handbook, I have to admit, we completely underestimated the impact it would make. A few days after the launch, it suddenly went viral with over 150,000 downloads in a single day. This week, it just ticked over 400,000 downloads. We always planned that the Handbook would be useful not just for climate myths but for communicators having to deal with any type of misinformation. Nevertheless, it was surprising to see the Handbook featured on websites as diverse as Richard Dawkins and Silobreaker. A website devoted to debunking MLM myths saw it as “useful when debating with brainwashed members of MLM organizations”. A Muslim forum speculated that it “Should be useful when engaging people who believe lies about Islam”. Currently, several educators are looking to integrate it into their curricula.

Here are some excerpts from reviews of the Debunking Handbook:

“I simply cannot believe that John Cook of Skeptical Science and psychologist Stephan Lewandowsky managed, in just 8 pages, to create something as magnificent as their new Debunking Handbook… As someone who teaches science communication, I’m going to recommend Cook’s and Lewandowsky’s handbook to as many folks as I can find.”
Chris Mooney, Desmogblog

“…a must-read summary of the scientific literature on how to extract pernicious myths from people’s minds and restore fact-based knowledge.”
Brad Johnson, Think Progress

“I have to say that Cook and Lewandowsky have done a great job of clearly and succinctly outlining the challenge(s) and providing actionable paths forward to deal with them… In short, consider the Debunking Handbook a must read and a must keep reference.”
A Siegel, Daily Kos

There were a few criticisms also. A science communicator from the University of Western Australia (a colleague of Stephan Lewandowsky) pointed out that in our case study debunking on Page 6, we weren’t practising what we preach by using a graphic that emphasised the myth rather than the core facts. Fair point. So we’ve updated our example debunking and also made a minor tweak to the text on The Overkill Backfire Effect. Click here to download the updated Debunking Handbook.

This blog post was cross-posted from Skeptical Science.

The Debunking Handbook: now freely available for download

The Debunking Handbook, a guide to debunking misinformation, is now freely available to download. Although there is a great deal of psychological research on misinformation, there’s no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of myths. The Debunking Handbook boils the research down into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation.

The Handbook explores the surprising fact that debunking myths can sometimes reinforce the myth in people’s minds. Communicators need to be aware of the various backfire effects and how to avoid them, such as:

  1. The Familiarity Backfire Effect: making the myth more familiar by repeating it
  2. The Overkill Backfire Effect: providing too many counter-arguments
  3. The Worldview Backfire Effect: presenting evidence that threatens a person’s worldview

It also looks at a key element of successful debunking: providing an alternative explanation. The Handbook is designed to be useful to all communicators who have to deal with misinformation (e.g., not just climate myths).

The Authors:

John Cook is the Climate Change Communication Fellow for the Global Change Institute at the University of Queensland. He created and runs Skeptical Science and co-authored the book Climate Change Denial: Heads in the Sand with environmental scientist Haydn Washington. In 2011, Skeptical Science won the Australian Museum Eureka Prize for the Advancement of Climate Change Knowledge.

Professor Lewandowsky is an Australian Professorial Fellow and a cognitive scientist at the University of Western Australia. He received a Discovery Outstanding Researcher Award from the Australian Research Council in 2011. His research examines people’s memory, decision making, and knowledge structures, with a particular emphasis on how people update information in memory. He has published over 120 scholarly articles, chapters, and books, including numerous papers on how people respond to misinformation. (See www.cogsciwa.com for a complete list of scientific publications.) Professor Lewandowsky is an award-winning teacher and was Associate Editor of the Journal of Experimental Psychology: Learning, Memory, and Cognition from 2006 to 2008. His research has been funded continuously since 1990 by public agencies in five countries, but he has no commercial interests of any kind. He has also contributed numerous opinion pieces to the global media on issues related to climate change “skepticism” and the coverage of science in the media. A complete list of his public essays can be found at http://www.shapingtomorrowsworld.org/inthemedia.htm, which is a blog run by academics from W.A.’s three major universities.

This post has been cross-posted at Skeptical Science

The Debunking Handbook Part 5: Filling the gap with an alternative explanation

The Debunking Handbook is an upcoming guide to debunking myths, by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, unfortunately there is no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of misinformation. This Handbook boils down the research into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation. The Handbook will be available as a free, downloadable PDF at the end of this 6-part blog series.

This post has been cross-posted at Skeptical Science

Assuming you successfully negotiate the various backfire effects, what is the most effective way to debunk a myth? The challenge is that once misinformation gets into a person’s mind, it’s very difficult to remove. This is the case even when people remember and accept a correction.

This was demonstrated in an experiment in which people read a fictitious account of a warehouse fire.1,2,3 Mention was made of paint and gas cans along with explosions. Later in the story, it was clarified that paint and cans were not present at the fire. Even when people remembered and accepted this correction, they still cited the paint or cans when asked questions about the fire. When asked, “Why do you think there was so much smoke?”, people routinely invoked the oil paint despite having just acknowledged it as not being present. 

When people hear misinformation, they build a mental model, with the myth providing an explanation. When the myth is debunked, a gap is left in their mental model. To deal with this dilemma, people prefer an incorrect model over an incomplete model. In the absence of a better explanation, they opt for the wrong explanation.4

In the warehouse fire experiment, when an alternative explanation involving lighter fluid and accelerant was provided, people were less likely to cite the paint and gas cans when queried about the fire. The most effective way to reduce the effect of misinformation is to provide an alternative explanation for the events covered by the misinformation. 

This strategy is illustrated particularly clearly in fictional murder trials. Accusing an alternative suspect greatly reduced the number of guilty verdicts from participants who acted as jurors, compared to defences that merely explained why the defendant wasn’t guilty.5

For the alternative to be accepted, it must be plausible and explain all observed features of the event.6,1 When you debunk a myth, you create a gap in the person’s mind. To be effective, your debunking must fill that gap.

One gap that may require filling is explaining why the myth is wrong. This can be achieved by exposing the rhetorical techniques used to misinform. A handy reference of techniques common to many movements that deny a scientific consensus is found in Denialism: what is it and how should scientists respond?7 The techniques include cherry picking, conspiracy theories and fake experts.

Another alternative narrative might be to explain why the misinformer promoted the myth. Arousing suspicion of the source of misinformation has been shown to further reduce the influence of misinformation.8,9

Another key element to effective rebuttal is using an explicit warning (“watch out, you might be misled”) before mentioning the myth. Experimentation with different rebuttal structures found the most effective combination included an alternative explanation and an explicit warning.4 

Graphics are also an important part of the debunker’s toolbox and are significantly more effective than text in reducing misconceptions. When people read a refutation that conflicts with their beliefs, they seize on ambiguities to construct an alternative interpretation. Graphics provide more clarity and less opportunity for misinterpretation. When self-identified Republicans were surveyed about their global warming beliefs, a significantly greater number accepted global warming when shown a graph of temperature trends compared to those who were given a written description.10 

Another survey found that when shown data points representing surface temperature, people correctly judged a warming trend irrespective of their views towards global warming.11 If your content can be expressed visually, always opt for a graphic in your debunking.
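For communicators who want to act on this advice, here is a minimal sketch of a trend graphic using Python and matplotlib. The anomaly values below are illustrative placeholders, not real observations; in practice you would plot a published temperature record instead.

    # Illustrative sketch: show a temperature trend as a graphic
    # rather than describing it in words. Placeholder data only.
    import numpy as np
    import matplotlib.pyplot as plt

    years = np.arange(1970, 2011)
    rng = np.random.default_rng(0)
    anomalies = 0.017 * (years - 1970) + rng.normal(0, 0.1, years.size)  # placeholder

    slope, intercept = np.polyfit(years, anomalies, 1)  # least-squares trend line

    plt.plot(years, anomalies, "o", label="Yearly anomaly (illustrative)")
    plt.plot(years, slope * years + intercept, label=f"Trend: {slope * 10:.2f} °C/decade")
    plt.xlabel("Year")
    plt.ylabel("Temperature anomaly (°C)")
    plt.legend()
    plt.savefig("trend.png", dpi=150)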

References

  1. Seifert, C. M. (2002). The continued influence of misinformation in memory: What makes a correction effective? The Psychology of Learning and Motivation, 41, 265-292.
  2. Wilkes, A. L., & Leatherbarrow, M. (1988). Editing episodic memory following the identification of error. The Quarterly Journal of Experimental Psychology A: Human Experimental Psychology, 40A, 361-387.
  3. Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued influence effect: When discredited information in memory affects later inferences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 20 (6), 1420-1436.
  4. Ecker, U. K., Lewandowsky, S., & Tang, D. T. (2010). Explicit warnings reduce but do not eliminate the continued influence of misinformation. Memory & Cognition, 38, 1087-1100.
  5. Tenney, E. R., Cleary, H. M., & Spellman, B. A. (2009). Unpacking the doubt in “Beyond a reasonable doubt:” Plausible alternative stories increase not guilty verdicts. Basic and Applied Social Psychology, 31, 1-8.
  6. Rapp, D. N., & Kendeou, P. (2007). Revising what readers know: Updating text representations during narrative comprehension. Memory & Cognition, 35, 2019-2032.
  7. Diethelm, P., & McKee, M. (2009). Denialism: what is it and how should scientists respond? European Journal of Public Health, 19, 2-4.
  8. Lewandowsky, S., Stritzke, W. G., Oberauer, K., & Morales, M. (2005). Memory for fact, fiction and misinformation: The Iraq War 2003. Psychological Science, 16, 190-195.
  9. Lewandowsky, S., Stritzke, W. G. K., Oberauer, K., & Morales, M. (2009). Misinformation and the ‘War on Terror’: When memory turns fiction into fact. In W. G. K. Stritzke, S. Lewandowsky, D. Denemark, J. Clare, & F. Morgan (Eds.), Terrorism and torture: An interdisciplinary perspective (pp. 179-203). Cambridge, UK: Cambridge University Press.
  10. Nyhan, B., & Reifler, J. (2011). Opening the Political Mind? The effects of self-affirmation and graphical information on factual misperceptions. In press.
  11. Lewandowsky, S. (2011). Popular consensus: Climate change set to continue. Psychological Science, 22, 460-463.

The Debunking Handbook Part 3: The Overkill Backfire Effect

The Debunking Handbook is an upcoming guide to debunking myths, by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, unfortunately there is no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of misinformation. This Handbook boils down the research into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation. The Handbook will be available as a free, downloadable PDF at the end of this 6-part blog series.

This post has been cross-posted at Skeptical Science

One principle that science communicators often fail to follow is making their content easy to process. That means easy to read, easy to understand and succinct. Information that is easy to process is more likely to be accepted as true.1 Merely enhancing the colour contrast of a printed font so it is easier to read, for example, can increase people’s acceptance of the truth of a statement.2

Common wisdom is that the more counter-arguments you provide, the more successful you’ll be in debunking a myth. It turns out that the opposite can be true: when it comes to refuting misinformation, less can be more. Debunks that offered three arguments, for example, were more successful in reducing the influence of misinformation than debunks that offered twelve arguments, which ended up reinforcing the myth.1

The Overkill Backfire Effect occurs because processing many arguments takes more effort than just considering a few. A simple myth is more cognitively attractive than an over-complicated correction.

The solution is to keep your content lean, mean and easy to read. Making your content easy to process means using every tool available. Use simple language, short sentences, subheadings and paragraphs. Avoid dramatic language and derogatory comments that alienate people. Stick to the facts. 

End on a strong and simple message that people will remember and tweet to their friends, such as “97 out of 100 climate scientists agree that humans are causing global warming”; or “Study shows that MMR vaccines are safe.” Use graphics wherever possible to illustrate your points.

Scientists have long followed the principles of the Information Deficit Model, which suggests that people hold erroneous views because they don’t have all the information. But too much information can backfire. Adhere instead to the KISS principle: Keep It Simple, Stupid!

References

  1. Schwarz, N., Sanna, L., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Advances in Experimental Social Psychology, 39, 127-161.
  2. Reber, R., & Schwarz, N. (1999). Effects of perceptual fluency on judgments of truth. Consciousness and Cognition, 8, 338-342.

The Debunking Handbook Part 2: The Familiarity Backfire Effect

The Debunking Handbook is an upcoming guide to debunking myths, by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, unfortunately there is no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of misinformation. This Handbook boils down the research into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation. The Handbook will be available as a free, downloadable PDF at the end of this 6-part blog series.

This post has been cross-posted at Skeptical Science

To debunk a myth, you often have to mention it – otherwise, how will people know what you’re talking about? However, this makes people more familiar with the myth and hence more likely to accept it as true. Does this mean debunking a myth might actually reinforce it in people’s minds?

To test for this backfire effect, people were shown a flyer that debunked common myths about flu vaccines.1 Afterwards, they were asked to separate the myths from the facts. When asked immediately after reading the flyer, people successfully identified the myths. However, when queried 30 minutes later, some people actually scored worse: the debunking had reinforced the myths.

Hence the backfire effect is real. The driving force is the fact that familiarity increases the chances of accepting information as true. Immediately after reading the flyer, people remembered the details that debunked the myths and successfully identified them. As time passed, however, the memory of those details faded, and all people remembered was the myth without the “tag” that identified it as false. This effect is particularly strong in older adults because their memory for details is more vulnerable to forgetting.

How does one avoid causing the Familiarity Backfire Effect? Ideally, avoid mentioning the myth altogether while correcting it. When seeking to counter misinformation, the best approach is to focus on the facts you wish to communicate. 

Not mentioning the myth is sometimes not a practical option. In this case, the emphasis of the debunking should be on the facts. The often-seen technique of headlining your debunking with the myth in big, bold letters is the last thing you want to do. Instead, communicate your core fact in the headline. Your debunking should begin with emphasis on the facts, not the myth. Your goal is to increase people’s familiarity with the facts.

References

  1. Skurnik, I., Yoon, C., Park, D., & Schwarz, N. (2005). How warnings about false claims become recommendations. Journal of Consumer Research, 31, 713-724.

The Debunking Handbook Part 1: The first myth about debunking

The Debunking Handbook is an upcoming guide to debunking myths, by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, unfortunately there is no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of misinformation. This Handbook boils down the research into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation. The Handbook will be available as a free, downloadable PDF at the end of this 6-part blog series.

This has been cross-posted at Skeptical Science.

Introduction

Debunking myths is problematic. Unless great care is taken, any effort to debunk misinformation can inadvertently reinforce the very myths one seeks to correct. To avoid these “backfire effects”, an effective debunking requires three major elements. First, the refutation must focus on core facts rather than the myth to avoid the misinformation becoming more familiar. Second, any mention of a myth should be preceded by explicit warnings to notify the reader that the upcoming information is false. Finally, the refutation should include an alternative explanation that accounts for important qualities in the original misinformation.
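As a concrete (and entirely hypothetical) illustration, the Python sketch below assembles a rebuttal from those three elements in the order just described: fact first, an explicit warning before the myth, then the alternative explanation. The function and its wording are made up for this example rather than taken from the Handbook.

    # Hypothetical sketch of the fact / warning / myth / alternative structure.
    def build_debunking(core_fact, myth, alternative_explanation):
        return "\n\n".join([
            core_fact,                                  # lead with the fact, not the myth
            "Warning: the following claim is FALSE.",   # explicit warning before the myth
            myth,
            alternative_explanation,                    # fill the gap the myth leaves behind
        ])

    print(build_debunking(
        core_fact="97 out of 100 climate scientists agree that humans are causing global warming.",
        myth='"There is no scientific consensus on human-caused global warming."',
        alternative_explanation=("The appearance of debate is manufactured by a small "
                                 "number of fake experts; surveys of the peer-reviewed "
                                 "literature show overwhelming agreement."),
    ))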

Debunking the first myth about debunking

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved.1,2  Reducing the influence of misinformation is a difficult and complex challenge.

A common misconception about myths is the notion that removing their influence is as simple as packing more information into people’s heads. This approach assumes that public misperceptions are due to a lack of knowledge and that the solution is more information – in science communication, it’s known as the “information deficit model”. But that model is wrong: people don’t process information as simply as a hard drive downloading data.

Refuting misinformation involves dealing with complex cognitive processes. To successfully impart knowledge, communicators need to understand how people process information, how they modify their existing knowledge and how worldviews affect their ability to think rationally. It’s not just what people think that matters, but how they think.

First, let’s be clear about what we mean by the label “misinformation” – we use it to refer to any information that people have acquired that turns out to be incorrect, irrespective of why and how that information was acquired in the first place. We are concerned with the cognitive processes that govern how people process corrections to information they have already acquired – if you find out that something you believe is wrong, how do you update your knowledge and memory?

Once people receive misinformation, it’s quite difficult to remove its influence. This was demonstrated in a 1994 experiment where people were exposed to misinformation about a fictitious warehouse fire, then given a correction clarifying the parts of the story that were incorrect.3 Despite remembering and accepting the correction, people still showed a lingering effect, referring to the misinformation when answering questions about the story. 

Is it possible to completely eliminate the influence of misinformation? The evidence indicates that no matter how vigorously and repeatedly we correct the misinformation, for example by repeating the correction over and over again, the influence remains detectable.4 The old saying got it right – mud sticks.

There is also an added complication. Not only is misinformation difficult to remove, debunking a myth can actually strengthen it in people’s minds. Several different “backfire effects” have been observed, arising from making myths more familiar,5,6 from providing too many arguments,7 or from providing evidence that threatens one’s worldview.8

The last thing you want to do when debunking misinformation is blunder in and make matters worse. So this handbook has a specific focus – providing practical tips to effectively debunk misinformation and avoid the various backfire effects. To achieve this, an understanding of the relevant cognitive processes is necessary. We explain some of the interesting psychological research in this area and finish with an example of an effective rebuttal of a common myth.

References

  1. Jacques, P. J., & Dunlap, R. E. (2008). The organisation of denial: Conservative think tanks and environmental skepticism. Environmental Politics, 17, 349-385.
  2. Oreskes, N., & Conway, E. M. (2010). Merchants of doubt. Bloomsbury Publishing. 
  3. Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued influence effect: When discredited information in memory affects later inferences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 20 (6), 1420-1436.
  4. Ecker, U. K., Lewandowsky, S., Swire, B., & Chang, D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review, 18, 570-578.
  5. Skurnik, I., Yoon, C., Park, D., & Schwarz, N. (2005). How warnings about false claims become recommendations. Journal of Consumer Research, 31, 713-724.
  6. Weaver, K., Garcia, S. M., Schwarz, N., & Miller, D. T. (2007). Inferring the popularity of an opinion from its familiarity: A repetitive voice sounds like a chorus. Journal of Personality and Social Psychology, 92, 821-833. 
  7. Schwarz, N., Sanna, L., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Advances in Experimental Social Psychology, 39, 127-161.
  8. Nyhan, B., & Reifler, J. (2010). When Corrections Fail: The Persistence of Political Misperceptions. Political Behavior, 32, 303-330.