The Kahneman and Tversky collaboration produced some of the most interesting and consequential work in the social sciences of the past century. Michael Lewis is an excellent author, and he tells their story in an enjoyable and exhaustive narrative. The reader gets to learn not just what their psychological discoveries were, but how and why they came to be.
I found Michael Lewis’s framing of their psychological work particularly interesting. In Moneyball, the focus is on how decision making can be made perfect and logical. In Kahneman and Tversky’s work, the focus is on why decision making is not that way.
The answer, of course, is that we are bad at thinking. We have cognitive biases. Everyone should learn about these phenomena. I really enjoyed the book, and would recommend it, along with Kahneman’s own book Thinking, Fast and Slow, to all.
- Complaining as a social phenomenon. Amos remarked that he “thought animals had more real pain than people and complained a lot less.” I know this was an off-hand witticism more than a deep statement of psychology or philosophy, but it made me think about the nature of complaining. People complain because we are extremely social creatures, and complaining can sometimes get other people to solve our problems. Animals are less social, and few cows, snakes, or fish are going to get another of their kind to help them out of a situation of distress. Animals are more stoic by the nature of their evolution.
- Multidisciplinary study is good. Hebrew University in the late 1950s required students to pick two fields of concentration. This is a good practice, and American universities should encourage cross-disciplinary study like this. People need some sort of generality in their education. Specialization is for insects. This also applies to Scott Adams’s mental model of skill stacks (or talent stacks).
- Make people empathize with each other by framing context to stress similarities. “For instance, if you wanted two people to think of themselves as more similar to each other than they otherwise might, you might put them in a context that stressed the features they shared. Two American college students in the United States might look at each other and see a total stranger; the same two college students on their junior year abroad in Togo might find that they are surprisingly similar: They’re both Americans! By changing the context in which two things are compared, you submerge certain features and force others to the surface.”
- Mental models are limited but often better than man’s intuition for decision making. “The implications were vast. ‘If these findings can be generalized to other sorts of judgmental problems,’ Goldberg wrote, ‘it would appear that only rarely—if at all—will the utilities favor the continued employment of man over a model of man.'”
- Danny and Amos pioneered one of my favorite terms: heuristics. “These rules of thumb Danny and Amos called ‘heuristics.’ And the first heuristic they wanted to explore they called ‘representativeness.'”
- The problem with displaying probabilities and statistics is that people will still fail to use them appropriately. “What was the point of laying out the odds of a gamble, if the person taking it either didn’t believe the numbers or didn’t want to know them? The trouble, Danny suspected, was that ‘the understanding of numbers is so weak that they don’t communicate anything. Everyone feels that those probabilities are not real—that they are just something on somebody’s mind.'”
- Judgment assigns values, which guides decisions. “The distinction between judgment and decision making appeared as fuzzy as the distinction between judgment and prediction. But to Amos, as to other mathematical psychologists, they were distinct fields of inquiry. A person making a judgment was assigning odds. How likely is it that that guy will be a good NBA player? How risky is that triple-A-rated subprime mortgage–backed CDO? Is the shadow on the X-ray cancer? Not every judgment is followed by a decision, but every decision implies some judgment.”
- When we make decisions, we often do not seek to maximize utility, but rather to minimize regret. But this is not the whole story, and Danny and Amos walked away from regret as a useful heuristic. As they sorted through the implications of their new discovery, one thing was instantly clear: Regret had to go, at least as a theory. It might explain why people made seemingly irrational decisions to accept a sure thing over a gamble with a far greater expected value. It could not explain why people facing losses became risk seeking. Anyone who wanted to argue that regret explains why people prefer a certain $500 to an equal chance to get $0 and $1,000 would never be able to explain why, if you simply subtracted $1,000 from all the numbers and turned the sure thing into a $500 loss, people would prefer the gamble. Amazingly, Danny and Amos did not so much as pause to mourn the loss of a theory they’d spent more than a year working on. The speed with which they simply walked away from their ideas about regret—many of them obviously true and valuable—was incredible.
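The arithmetic behind that example is worth making explicit. Here is a minimal sketch (my own illustration, not from the book) showing why regret cannot be the whole story: the gain frame and the loss frame have identical expected values, yet people's preferences flip between them.

```python
# Expected value of a gamble given as (outcome, probability) pairs.
def expected_value(outcomes):
    return sum(value * prob for value, prob in outcomes)

# Gain frame: a sure $500 vs. an equal chance at $0 or $1,000.
# Most people take the sure thing here (risk averse).
sure_thing = [(500, 1.0)]
gamble = [(0, 0.5), (1000, 0.5)]
assert expected_value(sure_thing) == expected_value(gamble) == 500

# Loss frame: subtract $1,000 from every outcome. The expected
# values remain equal, yet most people now prefer the gamble
# (risk seeking) -- which a regret-based theory cannot explain.
loss_sure = [(v - 1000, p) for v, p in sure_thing]    # a sure -$500
loss_gamble = [(v - 1000, p) for v, p in gamble]      # -$1,000 or $0
assert expected_value(loss_sure) == expected_value(loss_gamble) == -500
```

The point is that nothing in the payoff mathematics changes between the two frames; only the description of the choice does.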
- One may be able to beat the stock market by capitalizing on the systematic mistakes of others. Wherever economic theory assumes homo economicus, there is likely some interesting psychological bias at play. “If people could be systematically wrong, their mistakes couldn’t be ignored. The irrational behavior of the few would not be offset by the rational behavior of the many. People could be systematically wrong, and so markets could be systematically wrong, too.”
- Time heals psychological wounds. “Another, related, rule was that ‘an event becomes gradually less changeable as it recedes into the past.’ With the passage of time, the consequences of any event accumulated, and left more to undo. And the more there is to undo, the less likely the mind is to even try. This was perhaps one way time heals wounds, by making them feel less avoidable.”
- A prediction can be made to seem more believable, even as it becomes less likely, if it is filled with internally consistent details. “Any lawyer could at once make a case seem more persuasive, even as he made the truth of it less likely, by adding “representative” details to his description of people and events.”
A Few Heuristics:
- Representativeness: the more similar the specific case is to the notion in your head, the more likely you are to believe that the case belongs. “‘Our thesis,’ they wrote, ‘is that, in many situations, an event A is judged to be more probable than an event B whenever A appears more representative than B.’ The more the basketball player resembles your mental model of an NBA player, the more likely you will think him to be an NBA player. They had a hunch that people, when they formed judgments, weren’t just making random mistakes—that they were doing something systematically wrong.”
- Availability Heuristic: that which is memorable is that which influences thinking. “Amos and Danny wrote, ‘the use of the availability heuristic leads to systematic biases.’ Human judgment was distorted by . . . the memorable.”
- Conditionality Heuristic: the tendency for people to make unstated assumptions. “‘In assessing the profit of a given company, for example, people tend to assume normal operating conditions and make their estimates contingent upon that assumption,’ they wrote in their notes. ‘They do not incorporate into their estimates the possibility that these conditions may be drastically changed because of a war, sabotage, depressions, or a major competitor being forced out of business.'”
- Anchoring and Adjustment Heuristic: hearing a number anchors you towards it, even if it is unrelated to the subject at hand. People could be anchored with information that was totally irrelevant to the problem they were being asked to solve. For instance, Danny and Amos asked their subjects to spin a wheel of fortune with slots on it that were numbered 0 through 100. Then they asked the subjects to estimate the percentage of African countries in the United Nations. The people who spun a higher number on the wheel tended to guess that a higher percentage of the United Nations consisted of African countries than did those for whom the needle landed on a lower number.
- Hindsight Bias: the tendency to think something was more predictable after the fact. “Once they knew the outcome, they thought it had been far more predictable than they had found it to be before, when they had tried to predict it.”
- Selective Matching: the tendency to acknowledge evidence when it confirms a hypothesis, but ignore it when it does not. “For arthritis, selective matching leads people to look for changes in the weather when they experience increased pain, and pay little attention to the weather when their pain is stable. . . . [A] single day of severe pain and extreme weather might sustain a lifetime of belief in a relation between them.”
- Peak-End Rule: people’s memory of an experience is predicated on the peak pleasure or pain, as well as what they felt at the end of the experience. “If you stuck people’s arms in ice buckets for three minutes but warmed the water just a bit for another minute or so before allowing them to flee the lab, they remembered the experience more fondly than if you stuck their arms in the bucket for three minutes and removed them at a moment of maximum misery…”
- The Endowment Effect: the tendency for people to attach extra value to whatever they happen to own, simply because they own it. People are surprisingly reluctant to part with their possessions, or endowments, even when trading them makes economic sense.
- Framing: by changing the description of a situation, and making a gain seem like a loss, you can cause people to flip their attitude toward risk, and turn them from risk avoiding to risk seeking. “We invented framing without realizing we were inventing framing,” said Danny. “You take two things that should be identical—the way they differ should be irrelevant—and by showing it isn’t irrelevant, you show that expected utility theory is wrong.”
Various Clever Remarks and Aphorisms:
- “You know, Murray [Gell-Mann], there is no one in the world who is as smart as you think you are.”
- “The nice thing about things that are urgent,” he liked to say, “is that if you wait long enough they aren’t urgent anymore.”
- “People predict very little and explain everything.”
- “Last impressions can be lasting impressions.”
- “People did not choose between things. They chose between descriptions of things.”
- “The problem,” says Harvard social psychologist Amy Cuddy, “is that psychologists think economists are immoral and economists think psychologists are stupid.”