How Did Confirmation Bias Evolve?

I'm not really focusing on superstition.

My main curiosity is in the connection between ignoring evidence that contradicts your beliefs and stimulating the pleasure center. I'm ready to believe that a single genetic mutation resulted in a connection between these two groups of neurons, and that this bit of accidental neuron wiring was then selected for.

In pseudocode one would write this as follows:

FUNCTION ConfirmationBias(input, model)
    IF input doesn't agree with model THEN
        discard input
        StimulatePleasureCenter()
    END IF
END FUNCTION

FUNCTION StimulatePleasureCenter()
    whatever you just did, do it again ASAP and with more gusto
END FUNCTION
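A runnable sketch of that pseudocode (Python, purely illustrative; the "pleasure center" is just a counter, and nothing here is a claim about real neural wiring):

```python
def stimulate_pleasure_center(state):
    """Toy stand-in for reward: bump a counter for whatever was just done."""
    state["reward"] += 1

def confirmation_bias(observation, model, state):
    """If the observation contradicts the belief, discard it and self-reward;
    otherwise count it as support. A crude sketch of the pseudocode above."""
    if observation != model["belief"]:
        stimulate_pleasure_center(state)  # ignoring the evidence "feels good"
        return model                      # input discarded, belief unchanged
    model["support"] += 1                 # confirming evidence accumulates
    return model

state = {"reward": 0}
model = {"belief": "apples are safe", "support": 0}
for obs in ["apples are safe", "this apple made me sick", "apples are safe"]:
    model = confirmation_bias(obs, model, state)
# The contradicting report was dropped, the belief is untouched, and the
# act of dropping it was itself rewarded once.
```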

Could some dyed-in-the-wool skeptics have a damaged confirmation bias gene?

I only ask. ;)

I think I didn't make my point very well. The brain acts as an associative filter that is heavily invested in pattern creation and pattern recognition, and I would say that confirmation bias is just an offshoot of an otherwise healthy process. The tendency to see patterns is hard-wired and soft-programmed into humans; perceptions are even generated from sensations to fill in missing sensory data. The brain's ability to recognise patterns is crucial to survival and reproductive success: recognising plants and animals is very important, and recognising the places where they are found is very useful. The same is true of any resource, and the reverse is true for aversive situations. So seeing patterns is part of being human, and it stands to reason that there will be false positives: patterns may be seen that have low external validity.

Superstition has a very specific meaning in behaviorism.
http://www.psychology.uiowa.edu/Faculty/wasserman/Glossary/Superstitious behavior.html

Where is the data saying that confirmation bias, when confronted with contrary evidence, is linked to pleasurable sensations? I missed it.
 
I'm not really focusing on superstition.

My main curiosity is in the connection between ignoring evidence that contradicts your beliefs and stimulating the pleasure center. I'm ready to believe that a single genetic mutation resulted in a connection between these two groups of neurons, and that this bit of accidental neuron wiring was then selected for.

In pseudocode one would write this as follows:

FUNCTION ConfirmationBias(input, model)
    IF input doesn't agree with model THEN
        discard input
        StimulatePleasureCenter()
    END IF
END FUNCTION

FUNCTION StimulatePleasureCenter()
    whatever you just did, do it again ASAP and with more gusto
END FUNCTION

That's where the disconnect happened: what you're describing above is not confirmation bias - it's wilful ignorance. They're not the same.

I'll give you an example of confirmation bias:

Hypothesis: all apples are safe to eat.
Fact: I ate one
Fact: I didn't get sick or die
Conclusion: hypothesis supported

Confirmation bias has occurred.

Really, all we know is that *at least one* apple is safe to eat. There is no evidence that other apples are unsafe at this time, but neither have we invested in further exploration of this possibility by sampling more available apples to see if the original sample was too small.

This is very much the way we operate on a day-to-day basis, and it only matters in situations where the risk of confirmation bias masking an incorrect conclusion is mission-critical. In the example above, if we believed apples varied a lot, confirmation bias would be a bigger problem than if we already understood that they are fairly uniform. One sample may actually be enough to represent the category, in which case confirmation bias is not a big problem.

Confirmation bias is not the *rejection* of contrary information - it's the decision not to pursue it. It's probably the major cause of Type I error.
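The apple example can be simulated. Assuming, purely for illustration, that 20% of apples are actually unsafe, a single sample "confirms" the false hypothesis "all apples are safe" most of the time, while a slightly larger sample rarely does:

```python
import random

random.seed(0)

P_UNSAFE = 0.2    # assumed (made-up) true fraction of unsafe apples
TRIALS = 10_000   # number of simulated experimenters

def hypothesis_survives(n_samples):
    """True if eating n apples produces no contradicting evidence,
    i.e. 'all apples are safe' appears confirmed."""
    return all(random.random() > P_UNSAFE for _ in range(n_samples))

def confirmation_rate(n_samples):
    """Fraction of experimenters whose samples never contradict the hypothesis."""
    return sum(hypothesis_survives(n_samples) for _ in range(TRIALS)) / TRIALS

rate_one = confirmation_rate(1)    # ~0.8: one apple usually 'confirms'
rate_ten = confirmation_rate(10)   # ~0.8**10 ≈ 0.11: ten apples rarely do
```

Not pursuing more samples is exactly the "decision not to pursue" above: the hypothesis stays standing only because nothing was allowed to knock it down.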

Could some dyed-in-the-wool skeptics have a damaged confirmation bias gene?

As mentioned above, you have made a huge leap in using the word 'gene' for this. Before jumping to such a non sequitur, I'd start by seeing whether the trait is independently inherited, as opposed to an artefact of our general intelligence.
 
From Richard Dawkins' recent TV programme The Enemies of Reason:

Even in the 21st century, despite all that science has revealed about the indifferent vastness of the universe, the human mind is a wanton storyteller, creating intention in the randomness of reality. The delivery of rewards by a one-armed bandit is determined at random, but many gamblers want to think that what they do can increase their chances of winning the jackpot. They stand on one leg, or wear a lucky shirt. Are these superstitious behaviours a byproduct of our evolution?

All wild animals have to be kind of natural statisticians, looking for patterns in the apparent randomness of nature, when they're looking for food or trying to avoid predators. There are two kinds of mistake they can make: they can either fail to detect pattern when there is some; or they can seem to detect pattern when there isn't any - and that's superstition.

Spotting a flash of yellow in the long grass, filling in the rest from your imagination, thinking "lion" and running like hell, wastes energy if it's not really a lion. Not spotting the flash of yellow and running like hell gets you eaten, if it is really a lion. So the first kind of mistake has far worse consequences than the second kind. So natural selection favoured those who saw patterns that weren't there, over those who didn't see patterns that were there.
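The asymmetry Dawkins describes is easy to make concrete. With made-up but plausible costs, always running from the yellow flash has a far lower expected cost than ignoring it:

```python
# Expected-cost sketch of the lion example (all numbers are illustrative).
P_LION = 0.05            # chance the yellow flash really is a lion
COST_FALSE_ALARM = 1     # energy wasted running from nothing
COST_MISS = 1000         # getting eaten

def expected_cost(always_run):
    """Average cost per yellow flash for each strategy."""
    if always_run:                        # 'see the pattern' strategy
        return (1 - P_LION) * COST_FALSE_ALARM
    return P_LION * COST_MISS             # 'ignore the pattern' strategy

# Running at every flash costs ~0.95 on average; ignoring flashes costs 50.
```

Under any cost ratio this lopsided, selection favours the over-eager pattern detector, which is the quoted argument in miniature.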
 
I think we may be missing a deeper aspect to Confirmation Bias (and similar psychological effects). Its main aim isn't simply to keep us believing an old idea in the face of new conflicting evidence; more fundamentally, it is a mechanism for incorporating new facts into an old worldview.

In other words, Confirmation Bias isn't just a yes/no process where someone either accepts a fact or not. Rather, it is part of the all-too-common process of converting ideas we find psychologically threatening into something that does not threaten us or even which supports our overall worldview. Note that this doesn't necessarily mean the new fact is just discarded. Far more likely is that the new fact will be kept, but reinterpreted in such a way as to fit the old worldview.

Take this as an example of what I mean (and one that goes beyond just Confirmation Bias):

Let's say that I believe that Homeopathy actually works. I believe it so much that finding out that I was wrong would be a severe psychological blow -- maybe my belief is the basis for many of my friendships or even career, for example. At the very least I don't want to be shown as gullible or stupid.

So, when I conduct an otherwise scientific experiment to (from my p.o.v.) confirm Homeopathy, but come back with results I don't like, my first reaction isn't going to be to give up my belief. Remember, I am already confident that Homeopathy is real, so I am no more inclined to quickly accept that it isn't than a skeptic would be to accept that the Moon is made of cheese. In fact, my natural (and unconscious) reaction is to find some way to fit the result into my current belief system in a way that does not damage it, if not actually strengthen it.

Maybe I do this in a more straightforwardly paranoid way -- my detractors must have polluted my data! It's a conspiracy! More likely, though, I will just tend to unwittingly massage the data. Any hits I give more credence -- "The person who wrote down that data is a better and more accurate worker" -- while the many misses get downplayed -- "Those patients aren't good at taking their medicine." Even more subtly, I might well get indications that Homeopathy really does "work", but fail to notice more likely explanations: the cures that "worked" were for minor ailments that tend to clear up anyway, or were given alongside more conventional treatments, or it was just the Placebo Effect.

And, all that assumes I didn't put any bias into the study in the first place! There are all sorts of ways I could unconsciously influence my circumstances to guarantee I confirm my bias. Just sit down for a while and I am sure you can think up many more examples.

My point is that Confirmation Bias (and logical fallacies in general) doesn't really make complete sense (IMHO) unless one sees it as an ultimately all-too-human way not just to prevent psychological angst and disappointment, but more importantly to build and maintain a coherent mental worldview. Without that a person has no sense of themselves and thus no drive to do anything.

Defending yourself from such damage has a clear evolutionary advantage -- how well would our ancestors have succeeded if they fell to pieces psychologically in the face of the many difficulties they doubtlessly struggled through? They needed to maintain their overall confidence and belief in success (in the broadest sense). The same is true now -- you aren't likely to accomplish very much if you think you're doomed to failure or, even more fundamentally, if you lack any sense of self to do anything.

Of course, the obvious problem is that this sort of "mental defense mechanism" all-too-often goes too far and "defends" us from information that is ultimately beneficial to us.
 
I think it's more important for survival pre culture that we identify correlations versus causes. The most basic form of learning-- classical conditioning-- is all about discovering things in the environment that correlate with biologically significant events. Once the correlation is present, we respond with a reflex. This seems about as basic as it gets and the survival value seems obvious.

It doesn't matter whether the bell actually causes food to magically appear. From the dog's point of view, the bell being a reliable signal for food (even though not causal) is all that's needed (plus some mechanism for extinction, when and if the bell no longer correlates with the presence of food).
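A toy Rescorla-Wagner-style update (my own sketch, with an arbitrary learning rate) shows how a purely correlational signal is acquired and then extinguished:

```python
def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Track associative strength V across (bell, food?) trials.

    V climbs toward lam while the bell is followed by food, and decays
    back toward 0 (extinction) when it no longer is. alpha is the
    learning rate; both values here are arbitrary.
    """
    v = 0.0
    history = []
    for food in trials:
        target = lam if food else 0.0
        v += alpha * (target - v)    # error-driven update toward the outcome
        history.append(v)
    return history

# 20 reinforced trials, then 20 unreinforced ones: acquisition, then extinction.
strengths = rescorla_wagner([True] * 20 + [False] * 20)
```

Nothing in the update cares about causation; a reliable correlation is enough to drive the association up, and its disappearance is enough to drive it back down.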

I'd bet that confirmation bias and confusing correlation with cause are the two most frequently committed fallacies.

Maybe one causes the other? Still looking to prove that...
 
Also, confirmation bias is the fallacy of affirming the consequent, which "makes sense". The opposite would be modus tollens, which I think is less intuitive.
 
It is important to remember that evolution does not produce perfect organisms, only adequate ones.
 
Also, confirmation bias is the fallacy of affirming the consequent, which "makes sense". The opposite would be modus tollens, which I think is less intuitive.

What? Unless I'm missing your meaning, confirmation bias is when you only take those cases which agree with your belief, and ignore those which do not. Affirming the consequent is a formal logical fallacy relating to the rules of logic.
 
What? Unless I'm missing your meaning, confirmation bias is when you only take those cases which agree with your belief, and ignore those which do not. Affirming the consequent is a formal logical fallacy relating to the rules of logic.

Yeah, I think confirmation bias implies we seek to prove theories (AC) rather than falsify them (MT).

Smoking causes cancer.

If so, smokers should die sooner than non smokers.

They do!

See, smoking causes cancer.

I think it's the allure of the above fallacy that makes confirmation bias so common.
 
I think it's more important for survival pre culture that we identify correlations versus causes. The most basic form of learning-- classical conditioning-- is all about discovering things in the environment that correlate with biologically significant events. Once the correlation is present, we respond with a reflex. This seems about as basic as it gets and the survival value seems obvious.

It doesn't matter whether the bell actually causes food to magically appear. From the dog's point of view, the bell being a reliable signal for food (even though not causal) is all that's needed (plus some mechanism for extinction, when and if the bell no longer correlates with the presence of food).

I'd bet that confirmation bias and confusing correlation with cause are the two most frequently committed fallacies.

Maybe one causes the other? Still looking to prove that...

Noncontingent reinforcement can produce superstitious people, pigeons, and rats. Intermittent reinforcement can produce superstitions that are extremely resistant to extinction. Add to that selective memory for hits and the sharpening and leveling of our internal discourse about past events, le temps perdu.
 
Yeah, I think confirmation bias implies we seek to prove theories (AC) rather than falsify them (MT).

Smoking causes cancer.

If so, smokers should die sooner than non smokers.

They do!

See, smoking causes cancer.

I think it's the allure of the above fallacy that makes confirmation bias so common.

But affirming the consequent is a different fallacy from confirmation bias. They are often associated, but are not to be equated. Your example is an example of both affirming the consequent and confirmation bias.
 
But affirming the consequent is a different fallacy from confirmation bias. They are often associated, but are not to be equated. Your example is an example of both affirming the consequent and confirmation bias.

I think they're closely linked. Maybe JC can share some data on the frequency of AC versus MT using the Wason card task?
 
I think they're closely linked. Maybe JC can share some data on the frequency of AC versus MT using the Wason card task?

I disagree on the fundamental basis that they are different fallacies. One deals with data (confirmation bias, i.e. only choosing supporting data and ignoring all others), and the other deals with a logical argument. You can have one without the other, or any combination thereof.
 
Confirmation Bias is one of my research projects right now. After our discussion on it in the thread "Why the militant atheism?" I
Woo-Me argues forcefully that he is right and that the paw prints are just the sky spirit's trick to confuse us. Woo-Me is so emphatic and persuasive, because of the pleasure he experiences from ignoring ran-De's evidence, that the clan believes him and banishes ran-De. Babies continue to disappear and the colony becomes extinct, along with the confirmation bias gene.

So, why is the confirmation bias gene still around?

(After I started this I thought of an answer, but still want to see the discussion play out here untainted by my own confirmation bias. I've written what I think is the answer into a text file and will paste it here after the discussion approaches maturity.)

I don't really get the example. There are two alternative explanations presented, and the implication that confirmation bias explains the incorrect one. But confirmation bias may reinforce correct as well as incorrect explanations. The difference between the two explanations is that one is natural and one supernatural; both may depend on bias. In this example, the belief that a dingo takes the baby may in fact be based on bias for remembering cases where there were signs of a dingo and neglecting those cases where there were none. It just so happens that it is more likely to be correct in this case. There is no reason to think that the majority would not infer causation from correlation in the case of the dingo's footprints regardless of whether they also construct supernatural explanations.

The cost of being too reluctant to believe something when it is in fact true, may well be higher in general than the costs of being willing to believe something when it is in fact false. Where there is correlation, there will frequently be causation. Failure to infer causation when it exists will frequently have dire consequences. Incorrectly inferring causation and maintaining the belief through confirmation bias is likely to result in no more than some wasted energy or opportunities, and in many cases the activities that result may serve a social function that compensates for their costs.
 
I think it's more important for survival pre culture that we identify correlations versus causes. The most basic form of learning-- classical conditioning-- is all about discovering things in the environment that correlate with biologically significant events. Once the correlation is present, we respond with a reflex. This seems about as basic as it gets and the survival value seems obvious.

It doesn't matter whether the bell actually causes food to magically appear. From the dog's point of view, the bell being a reliable signal for food (even though not causal) is all that's needed (plus some mechanism for extinction, when and if the bell no longer correlates with the presence of food).

I'd bet that confirmation bias and confusing correlation with cause are the two most frequently committed fallacies.

Maybe one causes the other? Still looking to prove that...


Add to that that variable reinforcement is a very strong reinforcer. If you ring the bell on a variable schedule per food event, you get even stronger reinforcement than if you ring the bell every time.
 
There is also the important human fallacy of determinism to be avoided in approaching evolution. Because of contingent history, it is almost impossible to ask "why did this trait evolve?"; we can only ask "what other traits might this trait be associated with that could benefit reproductive success?"

For example, asking "Why did humans evolve intelligence?" is not as useful a strategy as asking "What led to humans' upright gait and infant neoteny?"
 
Add to that that variable reinforcement is a very strong reinforcer. If you ring the bell on a variable schedule per food event, you get even stronger reinforcement than if you ring the bell every time.
I think you are confusing respondent and operant conditioning here. Pairing a bell with food is Pavlovian respondent conditioning.
Intermittent reinforcement in operant conditioning is when reinforcement does not follow every response. A slot machine pays off on a variable ratio schedule and produces behavior that is difficult to extinguish. This is probably relevant to confirmation bias. Looking for data to confirm some hypothesis pays off enough of the time to maintain that strategy.
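One toy way to see why intermittent reinforcement resists extinction: suppose an animal only notices that extinction has begun when the current run of unrewarded responses exceeds anything it saw while the schedule was still active. Under a sparse schedule, long dry runs are normal, so detection is hugely delayed. (The detection rule is my own simplification, not a behaviorist model.)

```python
import random

random.seed(1)

def misses_before_detection(p_reward, history_len=500):
    """Longest run of unrewarded responses seen while reinforcement is active,
    plus one: the toy threshold at which extinction becomes noticeable."""
    run = longest = 0
    for _ in range(history_len):
        if random.random() < p_reward:
            run = 0                       # reward resets the dry streak
        else:
            run += 1
            longest = max(longest, run)
    return longest + 1

continuous = misses_before_detection(1.0)   # reinforced every response
variable = misses_before_detection(0.1)     # ~1 in 10 responses rewarded
# continuous == 1: a single miss is already off-schedule.
# variable is far larger: dozens of misses look perfectly normal.
```

The same logic fits the hypothesis-confirming strategy: as long as it pays off now and then, long stretches of failure don't look like evidence that the strategy is broken.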
Just as it is difficult to convince people that the correct solution to the Wason card problem is to pick the card that could disprove the statement that "All cards with a vowel on one side have an even number on the other", it is difficult to convey Popper's emphasis on falsifiability.
And BPesta, "AC vs MT" is too cryptic for me right now. Maybe after two more cups of coffee...
 
I think you are confusing respondent and operant conditioning here. Pairing a bell with food is Pavlovian respondent conditioning.
Intermittent reinforcement in operant conditioning is when reinforcement does not follow every response. A slot machine pays off on a variable ratio schedule and produces behavior that is difficult to extinguish. This is probably relevant to confirmation bias. Looking for data to confirm some hypothesis pays off enough of the time to maintain that strategy.
Just as it is difficult to convince people that the correct solution to the Wason card problem is to pick the card that could disprove the statement that "All cards with a vowel on one side have an even number on the other", it is difficult to convey Popper's emphasis on falsifiability.
And BPesta, "AC vs MT" is too cryptic for me right now. Maybe after two more cups of coffee...

Sorry, I took most of my classes in 76-79 and some more in 84-86; my terminology is horrible.
 
I think it's this:

If a card has a vowel on the front, it has an even number on the back:
if p then q


Cards

a (a vowel -- p; modus ponens: you should flip this one)
b (a consonant -- not p; denying the antecedent: don't flip it)
1 (an odd number -- not q; modus tollens: flip it)
2 (an even number -- q; affirming the consequent: don't flip it)


My point was that the fallacy AC is more intuitive / "makes sense" relative to modus tollens. I think this is an important aspect of confirmation bias (I know the literature calls it that).

I was wondering if you had data on the % of people who flip the AC card versus those who flip the MT card.
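The card selection can be written out directly. This sketch just encodes which visible faces could have a falsifying hidden side for the rule "if vowel, then even number":

```python
VOWELS = set("aeiou")

def could_falsify(face):
    """A visible face warrants a flip iff its hidden side could break the rule."""
    if face.isalpha():
        return face in VOWELS       # vowel: a hidden odd number would falsify
    return int(face) % 2 == 1       # odd number: a hidden vowel would falsify

cards = ["a", "b", "1", "2"]
to_flip = [c for c in cards if could_falsify(c)]
# → ["a", "1"]: modus ponens and modus tollens. Flipping "2" (affirming
# the consequent) can't falsify anything, so it tells you nothing.
```

The intuitive pull toward flipping "2" instead of "1" is exactly the confirm-rather-than-falsify tendency under discussion.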
 
It's not a huge deal, and I would concede the point, but I wonder whether every instance of confirmation bias is logically the same as the human tendency to flip the AC card but not the MT card (i.e., looking to confirm rather than falsify).
 
