Cognitive dissonance

[Image: Illustration for Aesop's fable, "The Fox and the Grapes", in which the fox exhibits cognitive dissonance.]
Not to be confused with the podcast of the same name. (Or the EarthBound fan-game.)
No man, for any considerable period, can wear one face to himself and another to the multitude, without finally getting bewildered as to which may be the true.
—Nathaniel Hawthorne, The Scarlet Letter

Cognitive dissonance is a psychological term describing the uncomfortable tension that results from holding two conflicting thoughts at once, or from engaging in behavior that conflicts with one's beliefs. The term also covers the behaviors that allow people to soothe, override, or otherwise overcome such dissonance.

A very simple example of this involves the act of giving blood. You are there and it is uncomfortable, but you know it is a good and necessary thing to do. So when asked "Are you comfortable?", you lie without thinking and say "everything is fine." You may even exaggerate and say out loud, with a smile on your face, "it's great", even though there's a 19-inch needle in your arm. Then, having said you are fine, your brain subconsciously begins to convince you that you really are fine, to alleviate the cognitive dissonance. This entire process is studied under the rubric of "cognitive dissonance".

Some famous non-religious examples include:

  • After feeling "buyer's remorse" and being unable to return something, convincing yourself that "no, it really was a good purchase" (purchase justification).
  • Justifying the work you do as important to your community or yourself.
  • Believing that fault or error (bad grades on a test, not getting a promotion, getting into an accident, etc.) must be someone else's fault and not yours.
  • Insisting that anyone in competition with you is bad, evil, and/or out to get you.

Resolving dissonance

The concept of cognitive dissonance was developed and tested by observing cults and how they reacted when their beliefs (in the end of the world) were shattered (by the world simply not ending), first and most famously in Leon Festinger, Henry Riecken, and Stanley Schachter's When Prophecy Fails.[1][2] As the sensation of dissonance is very unpleasant, most people tend to resolve it by adjusting their knowledge, beliefs, behaviors, and perceptions until they are mutually consistent. Sounds logical, but there is a catch: the resolution usually follows the path of least psychological resistance. For example, when the cults' prophecies were proved wrong, the followers' faith did not diminish; on the contrary, it strengthened, because it is much easier to simply disavow pieces of evidence as "false", make an excuse, and keep on believing, than it is to change a belief that has grown to be an individual's entire soul, fiber, and character. Even their memories were distorted; one such believer claimed that the date of the world's end had never been given with certainty, and evinced genuine surprise when his own words were played back saying that the world would absolutely, totally, for sure end on that date.

A prominent political example of resolving dissonance in this way can be found in the various smear attempts made against US President Barack Obama throughout 2008 and 2009. WorldNetDaily editor Joseph Farah, a key player in the Birther movement, held a firm belief that Obama was not American, and dismissed evidence to the contrary as insufficient or fraudulent. Evidence for such beliefs, on the other hand, is usually blindly accepted by way of confirmation bias.

In sum, the fact that many people can be persuaded to accept a poorly-constructed argument, despite (or even because of) its being riddled with logical fallacies, can often be explained by the acceptance of the argument producing less cognitive dissonance in the audience than its rejection would.[citation needed]

Many social engineering techniques, such as milk before meat and foot-in-the-door salesmanship, owe a large part of their success to the exploitation of cognitive dissonance.[citation needed] For example, cognitive dissonance is a large part of why hazing builds loyalty; if you go through a rough initiation to get into a fraternity, you'll go to great lengths to convince yourself that the organization is awesome enough to have been worth it. Similarly, end-of-the-world cultists often give away everything they own shortly before the appointed date, and will go to great lengths to avoid thinking it was all for nothing.

Creationists and cognitive dissonance

Creationists have particular problems with scientific concepts that conflict with (read: flat-out disprove) a 6,000-year-old Earth and a global flood occurring 4,300 years ago. This leads them to propose silly stuff like baramins, a changing speed of light, alleged problems with radiometric dating, and the like.

Cumulative cognitive dissonance

People don't like to think. If one thinks, one must reach conclusions. Conclusions are not always pleasant.
—Attributed to Helen Keller (possibly erroneously)

Most people will (eventually) change their beliefs on a subject after enough contradictory evidence emerges, because sometimes the evidence is so solid and undeniable that it is easier to give up a complex worldview than to constantly generate excuses for why the evidence against it must be false. Other individuals, especially those with support networks reinforcing a delusion or worldview, will go to such great lengths to rationalize away dissenting ideas that after a certain point, an admission of error would cause the collapse of an entire web of mutually-supporting beliefs. This would leave the brain unable to do its work, since everything it thought it knew would now be useless, producing agony, an extreme fear akin to that of death, and the activation of emergency self-protection mechanisms. These mechanisms push the individual into either an introverted reaction, with all-encompassing (willful) ignorance and the cutting off of any contact with the conflicting parts of the real world, or an extroverted reaction of trying to attack and destroy the sources of the conflicting information as heresy.

By comparison, a person who actually went through such a collapse without that protection would end up completely unable to accept themselves or to choose the "right" action in even the simplest situations, making it impossible to continue living. The protective reactions, then, are the least bad (because they mean survival) of three bad choices.

References

  1. When Prophecy Fails: A Social and Psychological Study of a Modern Group that Predicted the Destruction of the World by Leon Festinger et al. (1956) Martino Publishing. ISBN 1578988527.
  2. Classic Book: When Prophecy Fails by Leon Festinger reviewed by M. G. Saldivar (May 25, 2011) Cognitive Science Blog.