This is the third Gary Klein book your reviewer has read, and the second that she perceives to be a general assault on the love of rules and data-based decision making that permeates business life, government and much professional activity. Klein asserts that he is not the enemy of such ideas, but that the boundaries within which they remain useful (and beyond which they actually get dangerous) are narrower than most of us assume. The areas where one needs something else are what he calls the shadows, as opposed to the space under the streetlight: complex situations that are disorderly or unfamiliar, have ambiguous goals, and lack observational data, time, or clear feedback. According to Klein this covers the bulk of situations, akin to the sub-surface portion of an iceberg.
Received wisdom holds that procedures can substitute for and augment skill, so that workers can perform skilled tasks without years of practice. Choices should be made by setting out various options, then evaluating and comparing them; this will beget consistency and repeatability. More data is better than less before deciding anything. Inferences should be carefully drawn; conclusions should never be jumped to. All feedback is beneficial. Common ground (we are all on the same page) comes from setting rules and behaviours. In all of this there is a profound distrust of individual expertise and judgment, which exists but cannot be written down (because it resides in the shadows of memory and experience).
The problem, Klein argues convincingly, is how narrow the range of situations is for which these beliefs either work or offer practical help. There are voluminous procedures for air traffic controllers to deal with hijacking, but too many to read in time should an incident (usually unique in a number of respects) occur. Checklists and safeguards arguably created the situation in which Captain Pearson, a Canadian pilot, ran out of fuel mid-flight between Ottawa and Edmonton in 1983, and procedures were not exactly what he used to land the “Gimli Glider” safely (by slipping the aircraft at an angle to increase drag), since nobody had ever done that before. More mundanely, store detectives don’t have a rule book to decide which shoppers might be shoplifters and should thus get their attention. A racing expert can’t articulate all the reasons why she favours a given horse in a given race (well, she can, but only via lengthy de-briefing well after the event).
Apparently these folks just know things. In fact they embody what Klein calls tacit knowledge: perception, pattern recognition, mental simulation, workarounds; and this goes beyond any qualifications or credentials. Any lawyer will have passed all relevant exams and be certified, but good ones using tacit knowledge can review a contract by imagining what could go wrong and create headaches, mentally rotating clauses and conjuring up events that would pose risks. People have knowledge that they cannot describe, and this fact inappropriately depresses the market value of such knowledge in professional life. In a revealing yet pedestrian example, the author outlines how he and his wife were able to recall which Bergen hotel they had booked into, after arriving in Norway and realizing that neither of them had any conscious memory of which one it was; the knowledge was there all along (and extremely valuable to them in the circumstances).
Decision biases and heuristics, around which research psychologists have built a popular industry of revealing and bashing, are useful and not something to be eradicated, says Klein. They are not evidence of irrationality or defect, but effective strategies whose limitations researchers can exploit. Most experiments in this realm demonstrate the limits of classical decision theory (there is neither the time nor the data to use it), rather than the limitations of the test subjects. And Klein claims that nobody has actually shown that decisions in the field are improved by ridding their makers of biases. Intuition based on emotion comes in for particular scorn, but not many people would submit to imaginary (yet medically feasible) surgery to destroy the part of their brain that feeds emotions into the decision process.
Goals and objectives, long held sacred territory for corporate existence, can change, and require adaptation rather than adherence. This messes up many folks’ neat idea of how to run businesses and teams, but those people subscribe to an unhelpful fiction much of the time, Klein says. Even the supposed overall objective of a corporation (to make money for its owners) takes a back seat, more often than not, to survival, at least in the cases of successful businesses. It doesn’t look as great on paper, but Lehman Brothers went phut focusing on the wrong one of the two. In most competitive spheres, an impressive business or performance track record, which can quickly lead to lofty plans to be the best, can turn sour astonishingly quickly and re-focus attention on staying in the race. Hopefully before it’s too late.
Effective analysis can also be thwarted by teamwork, rather than helped as is usually supposed. In one exercise, military and intelligence teams were presented with a situation requiring anticipatory thinking, into which the researchers gradually fed weak signals contradicting the most likely initial anticipation. Although individuals on a team noted these signals, the dynamic of the team itself tended to suppress them; when individual analysis was used, the contradictory signals were more likely to surface. This seems to reveal groupthink, or lowest-common-denominator-ism.
Klein’s antidote, constructed through many closely examined examples (which should themselves dispel the aura of flakiness with which many sceptics regard intuition), is, as in his other books, to big up judgment: to realize that a hunch from an expert is not a wild guess but often an experience-based position that simply does not lend itself to standardized audit. Further: to actually hone and develop heuristics and biases, and to replace data gathering and option evaluation with pattern recognition and less orderly “sense-making”. Experts, he says, should be viewed as detectors, not walking Wikipedias: seeing and recognizing what others don’t, because of their experience. Those who aspire to expertise should understand what this means. And navigating through life (professional, personal, all of it) is better done in a recovery-oriented mindset than a follow-the-steps mindset, which also requires experience, adaptability, and depth and breadth of focus. Traditional methods of problem-solving and decision-making will, left unchecked, usurp brilliance and replace it with mediocrity. And the danger of that is an eventual collective forgetting of what brilliance ever was. That would be bad.