More than a bit reminiscent of "Being Wrong" (by Kathryn Schulz) to this reviewer, but predating it by several years, this book offers a theory for why mistakes get not just made but entrenched. Or rather, the incidence of mistakes themselves is not much touched on, as if they were random outcomes; what is examined is the investment of commitment and ego in them. The bottom line is that if something goes wrong, it is going to stay wrong.
Which is bad. If an innocent person is believed to be guilty of a crime (by the appropriate folks), the window through which they have any chance of not being convicted closes quickly. Once people have a prejudice, they do not easily drop it.
So far, so bad. But this starts a chain reaction, or ratchet effect, of behavior and subsequent self-justification, whereby beliefs and actions diverge from the rational. And they self-reinforce rather than self-correct. Confronted with disconfirming evidence for what you think, one would suppose that the path of least internal (cognitive) resistance is to change your mind. Apparently, in too many cases it is the reverse: altering your perception of the evidence, in order to retain the belief, is easier. "I was wrong" is far harder to accept than "the new data is wrong". Perhaps the starkest illustration of this mechanism lies in the recounted stories of "recovered memory" (a discredited practice that still has powerful adherents), where false beliefs about a past spent on the receiving end of abuse can seemingly be manufactured out of thin air.
This is adaptation gone awry. It is useful in some circumstances: when falling on hard times, it is a benefit to fade or alter the memory of how good things were before. In the authors' words, you can get what you want by revising what you had. But Tavris and Aronson are focused squarely on the detrimental effects of the same property. And it is innate, so there is no ready off-switch to throw in situations where memory, facts and observations should be accessed with impartiality.
This is most nefarious in what are known as closed loops: private interactions free of peer scrutiny, with no market testing, and where scepticism is optional (which it usually is). Training and experience are not an escape route from this either: rather than increasing accuracy, they mostly just increase confidence in one's accuracy, a step away from objectivity rather than towards it. Coming to a tentative conclusion and then having the doubt systematically sucked out of it spells disaster in medicine, psychotherapy and criminal justice, some of the areas on which the book majors.
Not touched on in the book, but relevant to this reviewer's field of investing, is the analogous technique of gathering more information to substantiate a prior conclusion. Information can often be interpreted either way and bent to support existing beliefs, particularly when those beliefs are expectations about market prices. Small wonder, then, that accumulating more of it does not correct faulty conclusions so much as reinforce conviction in them. And your reviewer would point out that even the cold truth of subsequent investment performance can have difficulty persuading her that she was wrong. Sometimes.
What is the real cause of this "jumping to convictions"? A preference for certainty over science (criminal trials). The (far) greater "deservability" of harm inflicted on others relative to having it done unto oneself (see Abu Ghraib, or Milgram's 37, which is more about self-justification than obedience to authority). And the drug of alleviated dissonance.
All this seems fearfully terminal. It isn't possible to critique conventional thought processes the way this book does without turning introspective at the end, which the authors do (making the book look a teeny bit like self-help, which it isn't really). The carrot offered is the joy / relief / release at hearing "I screwed up" (as with President Kennedy after the Bay of Pigs, or General Lee at Gettysburg). And, reflexively, the prospect of being on the receiving end of such sentiment and gratitude if one practices this oneself.
But this does not seem to happen very much, presumably because the hurdle to embracing dissonance and coming out the other side is so high. Perhaps more and repeated experience can raise its discount rate and make the step easier. In collective settings, turning up the lights of scrutiny, twinned with accepting error as something positive ("Hey, I succeeded in identifying a blind alley that I now know not to travel down"), can open the closed loop. The authors do not appear to hold much confidence that learning from error is about to become less unpopular. But that should probably mark the behavior out as an opportunity in an inefficient market.