Backfire effect
“What should be evident from the studies on the backfire effect is you can never win an argument online. When you start to pull out facts and figures, hyperlinks and quotes, you are actually making the opponent feel as though they are even more sure of their position than before you started the debate. As they match your fervor, the same thing happens in your skull. The backfire effect pushes both of you deeper into your original beliefs.”
—You Are Not So Smart, "The Backfire Effect"[1]
The backfire effect is a possible psychological effect originally proposed by Brendan Nyhan and Jason Reifler in 2010, based on their research into a single survey item presented to conservatives:[2][3]
“Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived.”
The effect was subsequently supported by other studies,[4][5][6][3] although still other studies found no evidence for it, or found that it appeared only under certain circumstances. The existence, strength, and applicability of the backfire effect are all currently under debate and study, with the best available evidence suggesting that it doesn't actually exist. Ironically, despite the dubious evidence, the idea of the effect has been confidently embraced and has become something of a pop psychology phenomenon.
The claimed effect is that, in the face of contradictory evidence, established beliefs do not change but actually grow stronger. The effect has been demonstrated experimentally in some psychological tests, in which subjects were given data that either reinforced or contradicted their existing biases — and in most cases, people could be shown to increase their confidence in their prior position regardless of the evidence they were faced with. In a pessimistic sense, this would make most refutations useless.
How it gets exploited
Anyone who knows they can profit from perpetuating a falsehood can exploit the backfire effect to their own advantage. This happens, for example, when corporations know that they are doing something harmful but nonetheless promote denialism. A corporation can even publicly admit that its product is harmful while simultaneously and covertly funding astroturf groups that promote the denialist message. Examples of this cover-up behavior include the asbestos industry, the tobacco industry, the sugar industry, and the fossil fuel industry with regard to global warming.[7] Such exploitation is not limited to industry; it also includes fundamentalist Christians (creationism) and promoters of alternative medicine (the anti-vaccination movement).[7]
How to fight it
“Not taking ideas personally is made easier by the meta-belief that holding certain beliefs does not make you a better person.”
—Peter Boghossian[8]
- Try to evaluate whether the person you are speaking with is being intellectually dishonest, and therefore highly unlikely to change. Intellectual dishonesty can stem from a vested interest in lying, whether financial (e.g., denying climate change because one works for the fossil fuel industry) or rooted in a fixed worldview (e.g., lying for Jesus). If a person responds to logic and facts, they may be more likely to change.
- Let tempers cool down a bit before bringing a subject up again. A large portion of the backfire effect stems from people not wanting to be seen as wrong or stupid in front of an audience. Once the flame wars die down a bit, people will be more emotionally able to accept your viewpoint.
- Stymied by the opponent's selfishness? If possible, show your opponent how something would benefit them personally. A lot of people view certain positions as a zero-sum game, and if you can show that it isn't (or at least that they'd get the long end of the stick), you may be able to bring them around.
- Just wait for a little bit. Sometimes what you said actually sank in to some extent; they just need some time to mull it over.
- Remember to ask questions — in fact, consider making at least half of your communications to them questions (genuine ones) instead of statements. Questions are often seen as less 'accusatory' and mulling over a question requires more thought than an angry knee-jerk reaction.
Of course, the most important thing to remember is that you yourself are just as vulnerable to the effect as your opponent. Guard against falling victim to the backfire effect by carefully examining evidence that seems to contradict your preconceptions and by allowing for the possibility that you might have been wrong.
Failures to replicate
Some later research calls the existence of the backfire effect into question. Thomas Wood and Ethan Porter conducted a follow-up study in which, rather than asking people about a single survey item phrased in a single way, they asked 10,100 subjects about 52 different issues.[3] Wood and Porter observed only one instance of the backfire effect, and that was when they used Nyhan and Reifler's original survey item as originally phrased.[3] When they simplified the phrasing of that item, they did not observe the effect.[3] Overall, their research casts doubt on the existence of the backfire effect.[3] The results were consistent with a well-documented feature of mass public opinion surveys: respondents avoid thinking (cognitive effort).[3][9][note 1] This likely also explains the difference between Wood and Porter's study and the studies that confirmed the backfire effect, since many of those were conducted in academic settings, whose denizens are well known for enjoying cognitive effort (e.g., coming up with counter-arguments in the face of facts).[3]
See also
- Blowback
- Confirmation bias — the converse effect
- Cognitive dissonance[note 2]
- Pommer's Law
- SIWOTI
- Streisand effect
External links
- The Backfire Effect: Why Facts Don’t Always Change Minds: an overview of the backfire effect, when and why it influences people, and how to counter it.
- Want to Win a Political Debate? Try Making a Weaker Argument, Pacific Standard Magazine
- The Backfire Effect: Why Facts Don't Win Arguments, BigThink
- The Elusive Backfire Effect: Mass Attitudes' Steadfast Factual Adherence, SSRN: the preprint that found no evidence for the backfire effect.
- I used to be opposed to vaccines. This is how I changed my mind. by Rose Branigin (February 11, 2019) The Washington Post.
Notes
- ↑ To borrow a quotation from another of our articles:
“People don't like to think. If one thinks, one must reach conclusions. Conclusions are not always pleasant.”
—Attributed to Helen Keller (possibly erroneously)
- ↑ Further information can be found in that article, but to summarize the relationship between the subjects, the common (but, for lack of a better word, inefficient) defense strategies that most people naturally employ against cognitive dissonance may go a long way towards explaining the backfire effect (assuming, of course, that it actually exists). (And that's not even getting into the potential ugly consequences of those defense strategies backfiring and allowing the dissonance to reach critical mass…)
References
- ↑ You Are Not So Smart — The Backfire Effect
- ↑ When Corrections Fail: The Persistence of Political Misperceptions by Brendan Nyhan & Jason Reifler (2010) Political Behavior 32(2):303–330.
- ↑ 3.0 3.1 3.2 3.3 3.4 3.5 3.6 3.7 The Elusive Backfire Effect: Mass Attitudes’ Steadfast Factual Adherence by Thomas Wood & Ethan Porter (January 2018) Political Behavior pp. 1-29. doi:10.1007/s11109-018-9443-y.
- ↑ Effective Messages in Vaccine Promotion: A Randomized Trial by Brendan Nyhan, Jason Reifler, Sean Richey & Gary L. Freed (2014) Pediatrics 133(4):e835–e842. doi:10.1542/peds.2013-2365.
- ↑ The Hazards of Correcting Myths about Health Care Reform by Brendan Nyhan, Jason Reifler & Peter A. Ubel (2013) Medical Care 51(2):127–132. doi:10.1097/MLR.0b013e318279486b.
- ↑ Rumors and Health Care Reform: Experiments in Political Misinformation by Adam J. Berinsky (June 2015) British Journal of Political Science pp. 1–22. doi:10.1017/S0007123415000186.
- ↑ 7.0 7.1 The Science of Why We Don't Believe Science: How our brains fool us on climate, creationism, and the vaccine-autism link by Chris Mooney (May/June 2011) Mother Jones.
- ↑ Not taking ideas personally is made easier by the meta-belief that holding certain beliefs does not make you a better person. by Peter Boghossian (10:27 PM - 23 Feb 2017) Twitter.
- ↑ Public Opinion by Walter Lippmann (1922) Harcourt, Brace and Company.