Cognitive dissonance

andyandy

anthropomorphic ape
I thought I would start a thread on this, as I am currently reading

Mistakes Were Made (But Not by Me)

which is an absolutely fascinating book on cognitive dissonance and how it affects what we think and how we think it.

The core of the book centres on the idea of cognitive dissonance, where the brain has to reconcile two contrasting viewpoints - for example, the self-belief "I am rational and intelligent" with the action "I am slowly killing myself by smoking". The dissonance could be resolved by concluding that actually I am neither rational nor especially intelligent, but of course no one wants to conclude that! So instead I look for levers to reduce the gap in the other direction: smoking helps me to relax, and stress is a big killer; smoking helps me to keep my weight down, and obesity is a big health problem. And so on.

That idea in itself is not especially remarkable, but what is remarkable is the wealth of studies investigating the impact of cognitive dissonance on our day-to-day lives. For example, students who are made to undergo a rigorous initiation before assessing the quality and usefulness of a recorded debate are far more likely to rate the debate as interesting and informative than students who are not required to go through such an initiation. The cognitive dissonance here is the gap between "I'm a rational and intelligent person" and "I've put myself through all this hard work to listen to this debate". Rather than conclude that we have wasted our time, which calls into question our intelligence, we instead resolve the dissonance by subconsciously overrating the usefulness or importance of what we have just listened to.

As I go through the book I will post some more studies to the thread, and if people wish to join in the discussion, that would be marvellous :)
 
For people who are interested in reading a bit more on this topic,

http://skepdic.com/cognitivedissonance.html

Marian Keech was the leader of a UFO cult in the 1950s. She claimed to get messages from extraterrestrials, known as The Guardians, through automatic writing. Like the Heaven's Gate folks forty years later, Keech and her followers, known as The Seekers or The Brotherhood of the Seven Rays, were waiting to be picked up by flying saucers. In Keech's prophecy, her group of eleven was to be saved just before the earth was to be destroyed by a massive flood on December 21, 1954. When it became evident that there would be no flood and the Guardians weren't stopping by to pick them up, Keech became elated. She said she'd just received a telepathic message from the Guardians saying that her group of believers had spread so much light with their unflagging faith that God had spared the world from the cataclysm (Levine 2003: 206).

More important, the Seekers didn't abandon her. Most became more devoted after the failed prophecy. (Only two left the cult when the world didn't end.) "Most disciples not only stayed but, having made that decision, were now even more convinced than before that Keech had been right all along....Being wrong turned them into true believers (ibid.)." Some people will go to bizarre lengths to avoid inconsistency between their cherished beliefs and the facts. But why do people interpret the same evidence in contrary ways?

The Seekers would not have waited for the flying saucer if they thought it might not come. So, when it didn't come, one would think that a competent thinker would have seen this as falsifying Keech's claim that it would come. However, the incompetent thinkers were rendered incompetent by their devotion to Keech. Their belief that a flying saucer would pick them up was based on faith, not evidence. Likewise, their belief that the failure of the prophecy shouldn't count against their belief was another act of faith. With this kind of irrational thinking, it may seem pointless to produce evidence to try to persuade people of the error of their ways. Their belief is not based on evidence, but on devotion to a person. That devotion can be so great that even the most despicable behavior by one's prophet can be rationalized. There are many examples of people so devoted to another that they will rationalize or ignore extreme mental and physical abuse by their cult leader (or spouse or boyfriend). If the basis for a person's belief is irrational faith grounded in devotion to a powerful personality, then the only option that person has when confronted with evidence that should undermine her faith would seem to be to continue to be irrational, unless her faith was not that strong to begin with.

http://en.wikipedia.org/wiki/Cognitive_dissonance

In Festinger and Carlsmith's classic 1959 experiment, students were made to perform tedious and meaningless tasks, consisting of turning pegs quarter-turns and, in another task, putting spools onto a tray, emptying the tray, refilling it with spools, and so on. Participants rated these tasks very negatively. After a long period of doing this, students were told the experiment was over and they could leave. This is an example of an induced compliance study.

However, the experimenter then asked the subject for a small favor. The subject was told that a research assistant who was needed had been unable to make it to the experiment, and the subject was asked to fill in and try to persuade another subject (who was actually a confederate) that the dull, boring tasks he had just completed were actually interesting and engaging. Some participants were paid $20 for the favor, another group was paid $1, and a control group was not asked to perform the favor.

When asked to rate the peg-turning tasks later, those in the $1 group rated them more positively than those in the $20 group and control group. This was explained by Festinger and Carlsmith as evidence for cognitive dissonance. Experimenters theorized that people experienced dissonance between the conflicting cognitions "I told someone that the task was interesting", and "I actually found it boring". When paid only $1, students were forced to internalize the attitude they were induced to express, because they had no other justification. Those in the $20 condition, it is argued, had an obvious external justification for their behavior. Behavior internalization is only one way to explain the subject's ratings of the task. The research has been extended in later years. It is now believed that there is a conflict between the belief that "I am not a liar", and the recognition that "I lied". Therefore, the truth is brought closer to the lie, so to speak, and the rating of the task goes up.

The researchers further speculated that with only $1, subjects faced insufficient justification and therefore "cognitive dissonance", so when they were asked to lie about the tasks, they sought to relieve this hypothetical stress by changing their attitude. This process allows the subject to genuinely believe that the tasks were enjoyable.

Put simply, the experimenters concluded that many human beings, when persuaded to lie without being given sufficient justification, will carry out the task by convincing themselves of the falsehood, rather than telling a bald lie.
 
Your example about smoking neglects the issues of bodily drives. An intelligent, rational person doesn't necessarily do everything as a result of an intelligent, rational analysis.

Someone could grip your nuts in a vise, and you'd scream and try to pull away. That was emotion-driven, not rational.

Yes, you could claim after the fact, as an aside, that it was intelligent and rational to do that - but then wouldn't the same apply to smoking, if it's rational simply to follow one's emotional drives?
 
Rather than conclude that we have wasted our time, which calls into question our intelligence, we instead resolve the dissonance by subconsciously overrating the usefulness or importance of what we have just listened to.

This doesn't necessarily track as CD. The assumption made is that the speech was worthless, or of limited value, but that may not necessarily be the case. Those who didn't have to put forth effort, but still had to listen to the speech, were there under a requirement to attend. Those who had to undergo the rigors to get in would naturally attach a perceived worth. But that doesn't mean the information wasn't worthwhile.

Still, your point is taken. It's the same reason that many humane shelters charge adoption fees. They have found that the return/abandonment of shelter animals decreases if there is a perceived worth attached.
 
I'm continually surprised that cognitive dissonance theory is given so much credibility.

Complaining about doublethink I could understand, as that's a simple description of a person's beliefs/actions, but cognitive dissonance is a theory that makes claims; claims that are not falsifiable.

There's no way to gauge or measure cognitive dissonance without producing results that are more easily explained through more parsimonious theories, such as self-perception:

Self-perception theory differs from cognitive dissonance theory in that it does not hold that people experience a "negative drive state" called "dissonance" which they seek to relieve. Instead, people simply infer their attitudes from their own behavior in the same way that an outside observer might. Self-perception theory is a special case of attribution theory.

Bem ran his own version of Festinger and Carlsmith's famous cognitive dissonance experiment. Subjects listened to a tape of a man enthusiastically describing a tedious peg-turning task. Some subjects were told that the man had been paid $20 for his testimonial and another group was told that he was paid $1. Those in the latter condition thought that the man must have enjoyed the task more than those in the $20 condition. Bem argued that the subjects did not judge the man's attitude in terms of cognitive dissonance phenomena, and that therefore any attitude change the man might have had in that situation was the result of the subject's own self-perception.

Whether cognitive dissonance or self-perception is a more useful theory is a topic of considerable controversy and a large body of literature, with no clear winner. There are some circumstances where either theory is preferred, but it is traditional to use the terminology of cognitive dissonance theory by default.

http://en.wikipedia.org/wiki/Self-perception_theory
 
When we begin a thought process by assuming we are rational beings we are committing our first irrationality.
 
Cognitive dissonance is probably one of the greatest predictors of counter-intuitive human behavior. Once you understand it, you see its effects everywhere: military training, Mormon mission trips (I'm not sure how aware people are of the process, but Mormons send their young members on two-year missions to spread the word, practice chastity, and tour third-world crapholes, and they come back brimming with the Holy Spirit), most woo-woo stuff.
 