
Ethan Porter was about to write the most difficult sentence of his PhD program. He opened his laptop. Looking at the keyboard, he reflected on the “nightmare stories” he’d heard about how professors had treated junior scholars who critiqued their work (Mantzarlis, 2016a). Sheepishly, he wrote to Professors Nyhan and Reifler: “Ok, guys, hope you’re well. Um, here’s our paper.” As he would admit later, “The crazy thing about this study is that we thought we did something wrong” (McRaney, 2018).

9/11, war and the historical context

This is a story of human mistakes and of mistakes inherent in the scientific process. To understand these mistakes and what happened next, we need to understand the context of this controversy. We need to go back to the year 2001, when the United States was attacked by a group of 19 al-Qaeda operatives, who caused significant damage to the Pentagon, destroyed the “Twin Towers” of New York’s World Trade Center, and killed 2,996 people, including themselves (Kean, 2011). The scale and suddenness of these attacks were unprecedented, at least in the American experience, and left a profound psychological impression on the country’s people for decades to come (Garfin, Poulin, Blum, & Silver, 2018; Silver, 2002).

On 19 March 2003, the United States responded by invading Iraq, largely under the pretence of dismantling the country’s supposed “weapons of mass destruction” program (Marr, 2011, p. 384). President Bush and his allies argued that this program had to be stopped with military force (Herring and Robinson, 2014), and used propaganda to justify the invasion (Kumar, 2006; Miller, 2003). Despite these assertions, no evidence has ever been found that Iraq had these weapons or a program to make them (Chilcot, 2016).

Brendan Nyhan was one American who was deeply affected by these events. The year the attacks happened, Nyhan graduated with a degree in political science and took a role as deputy communications director on a Democratic Senate campaign. He lamented how low the quality of political debate had become. “The 2000 campaign was something of a fact-free zone,” he would later say (Konnikova, 2014).

Along with two other left-leaning friends, he founded Spinsanity, a website that carefully checked the facts behind dodgy claims made by politicians and pundits, well before sites like PolitiFact and FactCheck.org sprang up. By the time the war and its false justification came, Nyhan didn’t just wish that people would take facts seriously; he was obsessed with it. Spurred by his website’s success, in 2004 he and Spinsanity’s co-founders wrote the New York Times bestseller All the President’s Spin: George W. Bush, the Media, and the Truth, a highly critical account of the media tactics used to justify the war, which by then had passed its 500th day (Fritz, Keefer, & Nyhan, 2004). The next year he graduated with a master’s degree in political science from Duke University and then enrolled in the institution’s PhD program, eventually producing a dissertation titled The Politics of Presidential Scandal in 2009.

The original study

While in the final years of this program, Nyhan produced his most famous work. Working with Jason Reifler, a young political science lecturer at Georgia State University, he wrote a paper called When Corrections Fail: The Persistence of Political Misperceptions (Nyhan and Reifler, 2010). Using quantitative analyses, the paper measured how people reacted when they read news stories that contained a misleading claim made by a politician that was subsequently corrected. Understanding how effective these corrections are is important because democracies work best when voters are properly informed about the claims their political candidates make. But, as the authors said, “…many citizens may base their policy preferences on false, misleading, or unsubstantiated information that they believe to be true” (Nyhan and Reifler, 2010, p. 304).

But this kind of research can be difficult. The human brain is regarded as the most complex object in the known universe (Kaku, 2015). This complexity lies not just in the various neurological and biochemical mechanisms that the brain uses, but also in the social and psychological factors that shape human decisions. The multifactorial clash of elements like beliefs, attitudes, behavioural intentions and worldviews makes it very difficult to separate the signal from the noise in social psychological research, and great care needs to be taken to avoid mistakes.

Regardless, Nyhan and Reifler sought to find out what happens when fact-checkers, like Nyhan’s Spinsanity team, try to correct people’s false beliefs. Do people simply accept this new information and update their beliefs, or do they ignore it? Nyhan and Reifler found that neither of these outcomes was true; they discovered something much more interesting. Their study reported that people who read a correction that ran counter to their political identities were more likely to believe the misinformation than they were before. Corrections, they wrote, “actually strengthen misperceptions among ideological subgroups in several cases” (Nyhan and Reifler, 2010, p. 323). They had discovered a new form of confirmation bias, and they called it “the backfire effect”[1] (Nyhan and Reifler, 2010, p. 307). Their argument was that when people who don’t want to believe a correction are shown one, they counterargue against it in their minds, and this counterarguing leaves them even more confident in their original beliefs than before.
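
To make the statistical logic concrete: in experiments like these, a backfire effect would appear as a correction that increases, rather than decreases, agreement with the misperception within an ideological subgroup. The short sketch below is a minimal illustration in Python, assuming a simple treatment-by-ideology regression; it is not Nyhan and Reifler’s actual analysis, and the data and variable names (correction, conservative, misperception) are invented for the example.

# Hypothetical illustration only – simulated data, not Nyhan and Reifler's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "correction": rng.integers(0, 2, n),    # 1 = participant read the corrected story
    "conservative": rng.integers(0, 2, n),  # 1 = self-identified conservative
})
# Simulated agreement with the misperception (a stand-in for a survey scale):
# in this made-up data the correction lowers agreement overall but raises it
# within the ideological subgroup.
df["misperception"] = (
    3.0
    - 0.8 * df["correction"]
    + 1.2 * df["correction"] * df["conservative"]
    + rng.normal(0, 1, n)
)

model = smf.ols("misperception ~ correction * conservative", data=df).fit()
print(model.summary())
# A significantly positive 'correction:conservative' coefficient would be the
# statistical signature of a backfire effect in that subgroup.

In these terms, the question the later replication studies asked was simply whether that positive interaction reappears across other issues and other samples.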

But their evidence wasn’t uniform across the four studies within their paper. Firstly, they found the backfire effect only in people who were politically conservative, and secondly, they found it only in a question about whether or not Iraq had had a “weapons of mass destruction” program in the lead-up to the Iraq war – the subject of only one of the four studies in the paper. Nyhan was cautious about drawing conclusions on the prevalence of this effect from his sample of 200 people, writing “…it would be valuable to replicate these findings with non-college students or a representative sample of the general population” (Nyhan and Reifler, 2010, p. 324).

Naturally, Nyhan then tried to replicate his findings in larger samples. He published another study with Reifler and another author that appeared to confirm the backfire effect on the politically polarising subject of the proposed Affordable Care Act, with a much larger but non-representative sample of 948 people (Nyhan, Reifler, & Ubel, 2013). Nyhan went on to further investigate the backfire effect, co-authoring two papers which found that correcting people’s misconceptions about vaccines did not change their behavioural intentions (Nyhan and Reifler, 2015; Nyhan, Reifler, Richey, & Freed, 2014). But curiously, one study went unpublished: it found that fact checks did correct people’s beliefs in a nationally representative sample of 1,000 American voters, providing evidence against his backfire effect theory (Nyhan and Reifler, 2016).

Meanwhile, other authors had started to investigate the backfire effect. A team of Australian and American authors couldn’t find what Nyhan and Reifler had reported in a US sample asked about a range of factual misstatements made by then-presidential candidate Donald Trump, finding instead that people did update their beliefs when presented with new information. They reported that there was “…no evidence for a worldview backfire effect…” (Swire, Berinsky, Lewandowsky, & Ecker, 2017, p. 17).

The controversy leaks

Sometimes new scientific research causes controversy only among the researchers interested in the field, and sometimes that controversy leaks out to those who have a stake in the findings. Nyhan’s findings were debated within the scientific community for many years, but the debate soon flowed through to professional fact-checkers, because the implications for them were enormous. If correcting someone’s mistaken belief could make them believe debunked information even more strongly, that cast doubt on the effectiveness of anyone who relied on objective evidence – not just political reporters, but scientists, economists, lawyers, historians and others.

Initially the findings didn’t receive a lot of media attention. While Nyhan was still a PhD student, a preprint version of his original paper was reported in the Washington Post and he was interviewed on NPR. But in the following years, in America and in countries around the world, the question of whether correcting falsehoods was a useful activity became increasingly prominent – and so did belief in the backfire effect. Academics and the media suggested that people’s disregard for facts, in both the political and scientific realms, was leading to a “post-truth” world, dominated by “fake news” (Davies, 2016; Sismondo, 2017). The “backfire effect” fed into this idea. Communicators now seemed to have an explanation for why they couldn’t convince people to change their minds on things like vaccines, genetically modified organisms and climate change.

The original paper has been cited 1,188 times – in other academic papers, in newspaper articles and in guides on how to conduct scientific investigations, ranging from Doing survey research: A guide to quantitative methods (Nardi, 2018) to Cognitive errors and diagnostic mistakes – A case-based guide to critical thinking in medicine (Howard, 2018). It was also excellent material for popular psychology, with David McRaney’s podcast You Are Not So Smart dedicating three episodes to explaining the evidence for it (McRaney, 2019). Figure 1 shows a scene from an episode of the popular TV show Adam Ruins Everything dedicated to the backfire effect, which went even further, with one character stating “The more you prove someone wrong, the more they think they’re right” (Murphy, 2017).

Figure 1. A scene from Adam Ruins Everything – The Backfire Effect (Murphy, 2017)

But perhaps the biggest popular impact came from a simple comic. Partly inspired by celebrity businessman Donald Trump’s election to the US presidency, The Oatmeal published a long-form comic strip suggesting the backfire effect was to blame whenever you couldn’t change someone’s mind, a part of which is shown in Figure 2. The Oatmeal’s Facebook post alone was shared more than 91,000 times (Inman, 2017).

Figure 2. You’re not going to believe what I’m about to tell you from The Oatmeal (Inman, 2017).

Not everyone was comfortable with the idea of the backfire effect. Professional fact-checkers who had followed Nyhan’s lead into the fact-checking business grew concerned about the implications if the backfire effect were true, and they voiced those concerns in the media at the time. “Why Fact-Checking Donald Trump Backfires,” explained Slate (Kopplin, 2015). “Fact-checking: does anyone even care?” asked Poynter (Mantzarlis, 2016b). “The Death Of Facts In An Age Of ‘Truthiness,’” NPR declared solemnly to its listeners (Raz, 2012). Even the Columbia Journalism Review, perhaps the world’s most popular publication for journalists, said of the backfire effect, “The media is ineffective at dispelling false rumors” (Silverman, 2011). It wasn’t just fact-checkers who started to dispute the effectiveness of facts. Science communicators added their concerns to the mix. Science journalist Chris Mooney, writing in Mother Jones, reported in a story titled “The Science of Why We Don’t Believe Science” that “In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever” (Mooney, 2011). Scientific American seemingly threw up its collective hands about its own work and asked, “Is truth an outdated concept?” (Shermer, 2019). The fact-checking business was under fire because of research done by one of its founding fathers.

Porter and Wood

Two young PhD students also had concerns. Like Nyhan before him, Ethan Porter was a Democratic supporter. The year that Nyhan completed the first draft of his original paper, Barack Obama was elected president, the culmination of a campaign Porter had joined the previous year. Porter later enrolled in a PhD program at the University of Chicago. It was there that he met Thomas Wood, then a PhD student in the final years of his program. Wood was an Australian who had studied part of his undergraduate degree in America. He’d returned to Australia to work in the Department of Defence, but came back to America to satisfy his interest in US politics and the more quantitative approach to political science available there (Ohio State University, 2014).

The failed replication

Porter and Wood had similar academic interests and came up with a research proposal that they knew would make their professional names. They decided they would try to expand upon the backfire effect, which Wood described as a “citation monster” (Engber, 2018). Their idea was to recruit by far the largest sample yet and see which particular issues caused the greatest backfire effect in which political identities. Following a methodology similar to that of Nyhan and Reifler (2010), they recruited 8,100 participants and looked for the backfire effect across 36 different and publicly contentious issues (Wood and Porter, 2016). But when they began their analysis, they were shocked, as Porter explains:

We were like “wait a second, there’s no backfire here!” I remember looking at the results and saying “gosh, we must have screwed something up. Where’s the backfire?”

(McRaney, 2018)

Across all 36 issues they found evidence of the backfire effect in only one: whether weapons of mass destruction were found in the Iraq war – the same issue on which Nyhan had found it in his original study. The prevalence of the backfire effect was seriously in doubt. But then they found something even worse. The effect disappeared if they used a simplified version of the weapons of mass destruction question that Nyhan had used (Wood and Porter, 2016, p. 34).

They then recruited even more participants to try to find backfire in several more topics, as Porter describes:

We went all-out. We then tried to be more and more aggressive. Like Obama talking about race. Or Trump talking about race. Thinking we were going to directly assail people with a white racial identity or no white racial identity, to do whatever it took to induce backfire. 

(McRaney, 2018).

In all, they tried to replicate Nyhan’s findings across 52 different topics with more than 10,000 participants. But after all this work they found evidence of the backfire effect in only one question – weapons of mass destruction in the Iraq war – and only when using the very complicated wording that Nyhan had used (Wood and Porter, 2016).

Porter contacts Nyhan

This is why Porter was so nervous when he emailed his paper to Nyhan. As early-career researchers, Porter and Wood knew that their accidental debunking of Nyhan’s work was going to be controversial and was almost certain to be questioned, not only by Nyhan but by all the people who now assumed the backfire effect was prevalent.

The mistakes and the challenges of communicating science

There are several reasons why this controversy arose and persisted for so long. Nyhan and Reifler weren’t specific enough about their variables, particularly their dependent variable. A similar problem was described by Collins and Pinch (2012) in their discussion of the supposed chemical transfer of memory: “Whether or not McConnell’s results could be replicated by others, or could be said to be replicable, depended on common agreement about what were the important variables in the experiment” (Collins and Pinch, 2012, p. 11). The same thing occurred in the story of cold fusion: “To those who failed to find anything, however, this was simply confirmation that there was nothing to be found” (Collins and Pinch, 2012, p. 69). Certainly, other researchers had investigated the backfire effect but gave up after failing to find it (Aird, Ecker, Swire, Berinsky, & Lewandowsky, 2018; Swire et al., 2017).

Changing someone’s mind is often a part of science communication. So, what does it mean to change someone’s mind, or to correct misperceptions, as Nyhan called it (Nyhan and Reifler, 2010)? In the various studies into the backfire effect, different authors had used terms like belief, attitude, behavioural intention and worldview almost interchangeably, and almost any negative result was labelled the backfire effect. Thinking back to the literature, Wood says “…there was a ton of conceptual slippage going on” (McRaney, 2018). But these terms mean different things in social psychology. Your attitudes are your general predispositions towards a person, an object or an issue (Vaughan and Hogg, 2013). You probably have an attitude for or against a political leader or a football team or a type of cuisine. But your beliefs are different. Your beliefs are those propositions that you understand to be true (Connors and Halligan, 2015). They are assessments of facts.

Nyhan could have avoided confusion if he had been more precise with his definitions. What the totality of his studies actually found was the same thing Porter and Wood found: people generally update their beliefs about a topic, but they don’t easily shift their attitudes. This is an important distinction because it’s not that facts have no impact; it’s that people tend to justify their existing attitudes with new beliefs. For example, if you tell people who are strongly against vaccinations that there is no elemental mercury in vaccines and show them the evidence, they may well change their beliefs about that. But they may still refuse to vaccinate their children by shifting the debate, perhaps by claiming that vaccines cause autism. Their beliefs may change, but their attitudes remain.

The reason belief in the backfire effect persisted for so long is a facet of the scientific process itself. Social psychology in particular has had to confront its “replication crisis”, in which several effects previously thought to be well established could not be replicated by later research (Yong, 2012). While some scientists defend this as the “self-correcting” mechanism of science, with findings tested again and again to build up a body of knowledge, there are many examples where this replication work did not happen for many years, leaving scientists and stakeholders under the misunderstanding that the original findings were reliable (Ioannidis, 2012).

One survey of 1,500 scientists across fields like biology, medicine, engineering and chemistry found that more than 70 per cent had tried and failed to replicate another scientist’s work (Baker, 2016). This is a clear example of the symmetry principle at work. The mistakes are quickly blamed on humans acting in good faith, but in reality the causes lie within the current scientific publishing system. Academic journals are often not interested in publishing replications – they are much more eager to publish new, counterintuitive, and even controversial findings that attract readers and media attention. Even Wood and Porter’s compelling and credible non-replication of Nyhan’s work was rejected by two journals before it was eventually published by Political Behavior (Engber, 2018). Everett and Earp (2015) describe the crisis as a particular “social dilemma” for early-career researchers, baked into the way science is done. Less experienced researchers are given no incentive to conduct the replications that science needs to be reliable, as the pressure to “publish or perish” falls hardest on those who haven’t had their work published in scientific journals before. It also leads to the “file-drawer” problem, where researchers who simply can’t replicate an interesting finding discard their research, knowing that it is unlikely to be published or suspecting that they performed the research badly (Franco, Malhotra, & Simonovits, 2014).

Resolution

Many controversies in science simply end like this – two groups of scientists reach two different conclusions, stakeholders remain confused, and there is no clear resolution. But that is not how this story ends. When Porter emailed Nyhan, by then a professor, the reaction wasn’t anything like the “nightmare story” Porter had feared. Nyhan welcomed the contradictory research and then quickly contacted Wood, Porter and his old research partner Reifler to work together to get to the bottom of the backfire effect. As Porter explains:

Brendan and Jason were like total mensches.[2] They were willing to think through and re-think this finding, and work to design a new experiment to test out the backfire effect once more.

(McRaney, 2018).

As Nyhan dryly explains, “It would be a terrible irony if evidence contradicting the backfire effect provoked me into doubling down on the backfire effect” (Nyhan, 2017).

Pooling their experience and resources, they conducted two more studies with large samples. They could not find the backfire effect: while attitudes remained firm, participants updated their beliefs, just as Porter and Wood had found. In a forthcoming paper the four authors wrote that “As with other recent research (Wood and Porter 2018), we find little evidence of a backfire effect on respondents’ factual beliefs” (Nyhan, Porter, Reifler, & Wood, 2019, p. 26).

A model example of science at work?

Some authors have suggested that the collaboration provides a model of how science should work (Resnick, 2017). Ethan Porter hints at this and the symmetry problem in his description of the backfire effect’s replication projects:

I’d love to think of our study as providing a model for people who have had replication troubles. If a researcher has trouble replicating another researcher’s work, those researchers should work together. This assumes good faith on everybody’s part. This is how science progresses, this is how people can actually learn more, which is precisely what we did. The paper is now public and we put it out together.

(McRaney, 2018).

Clearly the interpretation of Nyhan’s work – by the authors themselves, by other academics and by science communicators – was subject to human biases. However, these biases were identified only to explain away the mistake, in an asymmetric way. Likewise, the “success” of the collaboration has been explained as research done with “better research methods” (Engber, 2018). That is, success isn’t human; it is better science.

This indeed would be a model of how science should work, if not for the time lag between the initial study, published in 2010, and the eventual conclusion of Nyhan, Reifler, Wood and Porter’s collaboration, which has not been formally published (Nyhan et al., 2019). As previously shown, in the intervening period many people erroneously came to think that facts were irrelevant, often because they had read about the backfire effect. And now science will have to address what economist Paul Krugman calls a “zombie idea” – an idea that “…should have died long ago in the face of evidence that undermines its basic premise, but somehow just keeps shambling along” (Krugman, 2015). Now that so many people believe in the backfire effect, how do we correct them?

The story concludes

Wood and Porter haven’t given up on finding the backfire effect in other topics, continuing the research after their collaboration with Nyhan (see Figure 3) and describing the search for it as their “white whale”. But, as Wood concludes, “Backfire is very unusual, and I don’t think it should be something that affects the way fact-checkers work” (Mantzarlis, 2016a).

Figure 3. The hunt for the backfire effect continues (Wood, 2018)

The lessons for science communicators

Science communicators should be precise when communicating about science. The story of the backfire effect isn’t just a story about how science misled people; it’s also a story of how science communicators misled people. Those who sought to describe the backfire effect research made the same mistake the researchers did, in that they did not carefully describe the dependent variable. In many respects, science communicators should be like fact-checkers: selecting, verifying and presenting information to their audiences. But they did not do this effectively until Wood and Porter accidentally debunked the backfire effect.

Science communicators should be aware of the biases that may affect the profession. For example, science communicators may be primed to believe in the backfire effect: they are often taught about the limitations of facts through their study of the “information deficit model,” which falsely suggests that people will simply update their knowledge when new facts are presented to them (Bucchi and Trench, 2008; Gilbert and Stocklmayer, 2013). It is incumbent on science communicators not only to know the persuasive limits of facts, but also to know how and when they can be used effectively.

Science communicators should be wary of hype. Much of the coverage of the backfire effect engaged in hype. While the evidence for the backfire effect was never very clear, the reporting of it often suggested it was highly prevalent, as we have seen. The apparent discovery of the backfire effect led to “overheated expectations” that may have weakened efforts to correct false beliefs in practice (Brown, 2003). 

Science communicators should look at the totality of evidence before communicating about science. This includes looking for replications and the absence of replications. Science communicators should become adept at communicating uncertainty where evidence isn’t strong, particularly in newly evolving areas of science.

Finally, more than ever, science communicators need to expand the research on how to correct false beliefs and change attitudes. This is a known knowledge gap in the scientific literature (National Academies of Sciences, 2017). Misinformation is now able to spread rapidly across social media networks (Vosoughi, Roy, & Aral, 2018), and organised disinformation campaigns are implemented by foreign powers (Davis, 2019). Therefore, science communicators need to understand how we can get people not only to update their beliefs, but also to change their attitudes towards publicly controversial subjects like vaccinations, climate change and genetically modified organisms. To achieve this, science communicators should be encouraged to fill these gaps, perhaps as part of multidisciplinary teams including researchers from psychology and political science.

References

Aird, M. J., Ecker, U. K. H., Swire, B., Berinsky, A. J., & Lewandowsky, S. (2018). Does truth matter to voters? The effects of correcting political misinformation in an Australian sample. Royal Society Open Science, 5(12), p 180593. doi:10.1098/rsos.180593

Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature News, 533(7604), p 452.

Brown, N. (2003). Hope against hype – accountability in biopasts, presents and futures. Science & Technology Studies.

Bucchi, M., & Trench, B. (2008). Handbook of public communication of science and technology: Routledge.

Chilcot, J. (2016). The report of the Iraq inquiry. https://webarchive.nationalarchives.gov.uk/20171123123237/http://www.iraqinquiry.org.uk/

Collins, H. M., & Pinch, T. (2012). The golem: What you should know about science: Cambridge University Press.

Connors, M. H., & Halligan, P. W. (2015). A cognitive account of belief: a tentative road map. Frontiers in Psychology, 5, p 1588. doi:10.3389/fpsyg.2014.01588 Retrieved from https://www.ncbi.nlm.nih.gov/pubmed/25741291

Cook, J., & Lewandowsky, S. (2012). The Debunking Handbook. St Lucia, Australia: University of Queensland. http://sks.to/debunk

Davies, W. (2016, 24 August). The age of post-truth politics. The New York Times.

Davis, M. (2019). ‘Globalist war against humanity shifts into high gear’: Online anti-vaccination websites and ‘anti-public’ discourse. Public Understanding of Science, 28(3), pp. 357-371. doi:10.1177/0963662518817187 Retrieved from https://journals.sagepub.com/doi/abs/10.1177/0963662518817187

Engber, D. (2018, 3 January). LOL Something Matters. Slate. Retrieved from https://slate.com/health-and-science/2018/01/weve-been-told-were-living-in-a-post-truth-age-dont-believe-it.html

Everett, J., & Earp, B. (2015). A tragedy of the (academic) commons: interpreting the replication crisis in psychology as a social dilemma for early-career researchers. [Opinion]. Frontiers in Psychology, 6(1152). doi:10.3389/fpsyg.2015.01152 Retrieved from http://journal.frontiersin.org/article/10.3389/fpsyg.2015.01152

Franco, A., Malhotra, N., & Simonovits, G. (2014). Publication bias in the social sciences: Unlocking the file drawer. Science, 345(6203), pp. 1502-1505. doi:10.1126/science.1255484 Retrieved from https://science.sciencemag.org/content/sci/345/6203/1502.full.pdf

Fritz, B., Keefer, B., & Nyhan, B. (2004). All the president’s spin: George W. Bush, the media, and the truth: Simon and Schuster.

Garfin, D. R., Poulin, M. J., Blum, S., & Silver, R. C. (2018). Aftermath of Terror: A Nationwide Longitudinal Study of Posttraumatic Stress and Worry Across the Decade Following the September 11, 2001 Terrorist Attacks. Journal of Traumatic Stress, 31(1), pp. 146-156. doi:10.1002/jts.22262 Retrieved from https://dx.doi.org/10.1002/jts.22262

Gilbert, J. K., & Stocklmayer, S. (2013). Communication and engagement with science and technology: Issues and dilemmas: A reader in science communication: Routledge.

Herring, E., & Robinson, P. (2014). Report X marks the spot: The British government’s deceptive Dossier on Iraq and WMD. Political Science Quarterly, 129(4), pp. 551-584.

Howard, J. (2018). Cognitive Errors and Diagnostic Mistakes: A Case-Based Guide to Critical Thinking in Medicine: Springer.

Inman, M. (2017). You’re not going to believe what I’m about to tell you. Retrieved 2019, from https://twitter.com/Oatmeal/status/859511342981496832/photo/1?ref_src=twsrc%5Etfw%7Ctwcamp%5Etweetembed%7Ctwterm%5E859511342981496832&ref_url=https%3A%2F%2Fmashable.com%2F2017%2F05%2F06%2Foatmeal-backfire-effect-comic%2F.

Ioannidis, J. P. A. (2012). Why Science Is Not Necessarily Self-Correcting. Perspectives on Psychological Science, 7(6), pp. 645-654. doi:10.1177/1745691612464056 Retrieved from https://dx.doi.org/10.1177/1745691612464056

Kaku, M. (2015). The future of the mind: The scientific quest to understand, enhance, and empower the mind: Anchor Books.

Kean, T. (2011). The 9/11 commission report: Final report of the national commission on terrorist attacks upon the United States: Government Printing Office.

Konnikova, M. (2014, 16 May 2014). I don’t want to be right. The New Yorker. Retrieved from https://www.newyorker.com/science/maria-konnikova/i-dont-want-to-be-right

Kopplin, Z. (2015, 30 December). Why Fact-Checking Donald Trump Backfires. Slate. Retrieved from https://slate.com/technology/2015/12/fact-checking-trump-can-backfire-due-to-motivated-reasoning.html

Krugman, P. (2015). Zombies of 2016. New York: New York Times Company.

Kumar, D. (2006). Media, War, and Propaganda: Strategies of Information Management During the 2003 Iraq War. Communication and Critical/Cultural Studies, 3(1), pp. 48-69. doi:10.1080/14791420500505650 Retrieved from https://dx.doi.org/10.1080/14791420500505650

Mantzarlis, A. (2016a, 2 November 2016). Fact-checking doesn’t ‘backfire,’ new study suggests. Poynter. Retrieved from https://www.poynter.org/fact-checking/2016/fact-checking-doesnt-backfire-new-study-suggests/

Mantzarlis, A. (2016b). Fact-checking: does anyone even care? Poynter. Retrieved from https://www.poynter.org/fact-checking/2016/fact-checking-does-anyone-even-care/

Marr, P. (2011). The Modern History of Iraq Boulder, UNITED STATES: Westview Press.

McRaney, D. (2018). YANSS 120 – The Backfire Effect – Part Four. You Are Not So Smart podcast.

McRaney, D. (2019). You are not so smart: A celebration of delusion. Retrieved 2019, from https://youarenotsosmart.com/.

Miller, D. (2003). Tell me lies: Propaganda and media distortion in the attack on Iraq: Pluto Press.

Mooney, C. (2011). The science of why we don’t believe science. Mother Jones. Retrieved from https://www.motherjones.com/politics/2011/04/denial-science-chris-mooney/

Murphy, L. (Writer). (2017). Adam Ruins Everything – Episode 37, The Backfire Effect. In J. Reitz (Producer).

Nardi, P. M. (2018). Doing survey research: A guide to quantitative methods: Routledge.

National Academies of Sciences, Engineering, and Medicine. (2017). Communicating science effectively: A research agenda. Washington, DC: The National Academies Press.

Nyhan, B. (2017). Walking Back the Backfire Effect. In B. Gladstone (Ed.), On the Media. New York, USA: WNYC.

Nyhan, B., Porter, E., Reifler, J., & Wood, T. J. (2019). Taking Fact-checks Literally But Not Seriously? The Effects of Journalistic Fact-checking on Factual Beliefs and Candidate Favorability. Political Behavior, pp. 1-22.

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), pp. 303-330.

Nyhan, B., & Reifler, J. (2015). Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine, 33(3), pp. 459-464. doi:10.1016/j.vaccine.2014.11.017 Retrieved from https://dx.doi.org/10.1016/j.vaccine.2014.11.017

Nyhan, B., & Reifler, J. (2016). Do people actually learn from fact-checking? Evidence from a longitudinal study during the 2014 campaign. Unpublished manuscript.

Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective messages in vaccine promotion: a randomized trial. Pediatrics, 133(4), pp. e835-e842.

Nyhan, B., Reifler, J., & Ubel, P. A. (2013). The hazards of correcting myths about health care reform. Med Care, 51(2), pp. 127-132. doi:10.1097/MLR.0b013e318279486b

Ohio State University. (2014). Department Welcomes Thomas Wood. Retrieved Date Accessed, 2019  from https://polisci.osu.edu/news/department-welcomes-thomas-wood.

Raz, G. (2012). The Death Of Facts In An Age Of ‘Truthiness’. Washington DC, USA: NPR.

Resnick, B. (2017, 10 July). Trump supporters know Trump lies. They just don’t care. Vox. Retrieved from https://www.vox.com/2017/7/10/15928438/fact-checks-political-psychology

Shermer, M. (2019, 1 March). Is Truth an Outdated Concept? Scientific American.

Silver, R. C. (2002). Nationwide Longitudinal Study of Psychological Responses to September 11. JAMA, 288(10), p 1235. doi:10.1001/jama.288.10.1235 Retrieved from https://dx.doi.org/10.1001/jama.288.10.1235

Silverman, C. (2011, 27 May). “Death Panels” Report Reaches Depressing Conclusions. Columbia Journalism Review. Retrieved from https://archives.cjr.org/behind_the_news/death_panels_report_reaches_de.php

Sismondo, S. (2017). Post-truth? Social Studies of Science, 47(1), pp. 3-6. doi:10.1177/0306312717692076 Retrieved from https://dx.doi.org/10.1177/0306312717692076

Swire, B., Berinsky, A. J., Lewandowsky, S., & Ecker, U. K. H. (2017). Processing political misinformation: comprehending the Trump phenomenon. Royal Society Open Science, 4(3), p 160802. doi:10.1098/rsos.160802 Retrieved from https://dx.doi.org/10.1098/rsos.160802

Vaughan, G. M., & Hogg, M. A. (2013). Social Psychology: Pearson Higher Education AU.

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), pp. 1146-1151. doi:10.1126/science.aap9559 Retrieved from http://science.sciencemag.org/content/sci/359/6380/1146.full.pdf

Wood, T. [@thomasjwood]. (2018). Because the internet demanded it, the search for *any* backfire goes on. Fake News and Misinformation, Salon 2 3rd floor, 9:45am #MPSA18 @EthanVPorter [Tweet].

Wood, T., & Porter, E. (2016). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, pp. 1-29.

Yong, E. (2012). Replication studies: Bad copy. Nature News, 485(7398), p 298.


[1] Technically the “Worldview backfire effect”, see Cook and Lewandowsky (2012, pp. 2-4)

[2] A mensch is a person of integrity, from the Yiddish language