Tuesday, October 25, 2011

Nobelist Daniel Kahneman: "Experts may be in the grip of an illusion"

The field of behavioral economics, basically unknown thirty years ago, has had a profound influence on management, negotiation, marketing and many other fields [sadly, it seems to have had less influence on economics itself, a subject that sorely needs some shaking up]. The dean of behavioral economics is Daniel Kahneman, recipient of the Nobel Memorial Prize in Economic Sciences in 2002 along with Vernon L. Smith.

Behavioral economics upends traditional economic thinking by asserting that people are not rational actors - instead, our minds are riddled with biases, blind spots, evolutionary holdovers and other quirks that get in the way of logical thinking. As a result, we act in ways that seem perfectly sensible to us but bewilderingly mysterious to others. These defects (as Mr. Spock might call them) lead us both to make mistakes and to fail to learn from them.

Kahneman has a new book, "Thinking, Fast and Slow," which was excerpted in the New York Times Magazine. In the excerpt, Kahneman relates a story from his long-ago assignment with the Israeli military, in which his team of psychologists was unable to predict leadership qualities from a field test designed to measure exactly that. Notice below that true expertise involves humility - recognizing mistakes quickly and absorbing those lessons, again and again. [There's a simple name for this process: "experience."]


We often interact with professionals who exercise their judgment with evident confidence, sometimes priding themselves on the power of their intuition. In a world rife with illusions of validity and skill, can we trust them? How do we distinguish the justified confidence of experts from the sincere overconfidence of professionals who do not know they are out of their depth? We can believe an expert who admits uncertainty but cannot take expressions of high confidence at face value. As I first learned on the obstacle field, people come up with coherent stories and confident predictions even when they know little or nothing. Overconfidence arises because people are often blind to their own blindness.

True intuitive expertise is learned from prolonged experience with good feedback on mistakes. You are probably an expert in guessing your spouse's mood from one word on the telephone; chess players find a strong move in a single glance at a complex position; and true legends of instant diagnoses are common among physicians. To know whether you can trust a particular intuitive judgment, there are two questions you should ask: Is the environment in which the judgment is made sufficiently regular to enable predictions from the available evidence? The answer is yes for diagnosticians, no for stock pickers. Do the professionals have an adequate opportunity to learn the cues and the regularities? The answer here depends on the professionals' experience and on the quality and speed with which they discover their mistakes. Anesthesiologists have a better chance to develop intuitions than radiologists do. Many of the professionals we encounter easily pass both tests, and their off-the-cuff judgments deserve to be taken seriously. In general, however, you should not take assertive and confident people at their own evaluation unless you have independent reason to believe that they know what they are talking about. Unfortunately, this advice is difficult to follow: overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion.
