Shane Parrish’s website Farnam Street is one of my favorites on the internet. His collection of mental models is large, clear, and concise, and it is what inspired me to begin collecting mental models myself. When I found out he was writing a book—or, more precisely, a series of books—I was ecstatic and preordered this one months ahead of time.
It was good. He walks through a number of the important mental models, cognitive distortions, and more, and I took many useful notes. One organizational choice I appreciated: he introduces all of each chapter’s characters up front, at the very beginning, a technique I hadn’t seen before. If I were to find fault with the book, it would be that some of his examples don’t fully represent the models they are meant to illustrate. Still, they worked well enough.
- Avoiding problems is better than solving them. “The author and explorer of mental models, Peter Bevelin, put it best: ‘I don’t want to be a great problem solver. I want to avoid problems—prevent them from happening and doing it right from the beginning.'”
- Failure to convert bad intuition to rational decision-making is due to perspective, ego, and distance. “Our failures to update from interacting with reality spring primarily from three things:… The first flaw is perspective. We have a hard time seeing any system that we are in… The second flaw is ego. Many of us tend to have too much invested in our opinions of ourselves to see the world’s feedback—the feedback we need to update our beliefs about reality… The third flaw is distance. The further we are from the results of our decisions, the easier it is to keep our current views rather than update them. When you put your hand on a hot stove, you quickly learn the natural consequence. You pay the price for your mistakes. Since you are a pain-avoiding creature, you update your view. Before you touch another stove, you check to see if it’s hot. But you don’t just learn a micro lesson that applies in one situation. Instead, you draw a general abstraction, one that tells you to check before touching anything that could potentially be hot. Organizations over a certain size often remove us from the direct consequences of our decisions. When we make decisions that other people carry out, we are one or more levels removed and may not immediately be able to update our understanding.”
- “As Confucius said, ‘A man who has committed a mistake and doesn’t correct it, is committing another mistake.'”
- No model is perfect. “The map of reality is not reality. Even the best maps are imperfect. That’s because they are reductions of what they represent. If a map were to represent the territory with perfect fidelity, it would no longer be a reduction and thus would no longer be useful to us… The truth is, the only way we can navigate the complexity of reality is through some sort of abstraction. When we read the news, we’re consuming abstractions created by other people. The authors consumed vast amounts of information, reflected upon it, and drew some abstractions and conclusions that they share with us. But something is lost in the process. We can lose the specific and relevant details that were distilled into an abstraction.”
- Falsifiability. “Karl Popper wrote, ‘A theory is part of empirical science if and only if it conflicts with possible experiences and is therefore in principle falsifiable by experience.’ The idea here is that if you can’t prove something wrong, you can’t really prove it right either. Thus, in Popper’s words, science requires testability: ‘If observation shows that the predicted effect is definitely absent, then the theory is simply refuted.’ This means a good theory must have an element of risk to it—namely, it has to risk being wrong. It must be able to be proven wrong under stated conditions.”
- Intersubjective collective fictions. “When it comes down to it, everything that is not a law of nature is just a shared belief. Money is a shared belief. So is a border. So is bitcoin. So is love. The list goes on.”
- Chaotic versus non-chaotic systems. “Let’s think about another chaotic system we’re all familiar with, the weather. Why is it that we can predict the movement of the stars but we can’t predict the weather more than a few weeks out, and even that is not altogether reliable? It’s because weather is highly chaotic. Any infinitesimally small error in our calculations today will change the result down the line, as rapid feedback loops occur throughout time. Since our measurement tools are not infinitely accurate, and never will be, we are stuck with the unpredictability of chaotic systems.”
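The point about infinitesimally small errors compounding can be demonstrated with the logistic map, a standard toy chaotic system (my example, not the book’s). Two trajectories that start one part in ten billion apart agree at first, then diverge until they bear no resemblance to each other:

```python
# Logistic map at r = 4, a textbook chaotic system.
def logistic(x, r=4.0):
    return r * x * (1 - x)

def trajectory(x0, steps):
    """Iterate the map `steps` times, recording every value."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1]))
    return xs

a = trajectory(0.2, 50)
b = trajectory(0.2 + 1e-10, 50)  # measurement error of one part in ten billion

# Early on the paths are indistinguishable...
print(abs(a[5] - b[5]))   # still a tiny difference
# ...but the error roughly doubles each step, so by step 50
# the two trajectories are completely decorrelated.
print(abs(a[50] - b[50]))
```

This is exactly why finite-precision measurement tools doom long-range weather prediction: no matter how small the initial error, the feedback loops amplify it.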
- One can use the “veil of ignorance” to derive fairness. Our initial intuition of what is fair is likely to be challenged. “When confronted with the question of how best to organize society, we have this general feeling that it should be fair. But what exactly does this mean? We can use this thought experiment to test the likely outcomes of different rules and structures to come up with an aggregate of “most fair.””
- Second-order effects example. “Warren Buffett used a very apt metaphor once to describe how the second-order problem is best described by a crowd at a parade: Once a few people decide to stand on their tip-toes, everyone has to stand on their tip-toes. No one can see any better, but they’re all worse off.”
- Relying too heavily on slippery-slope arguments is stupid. “Garrett Hardin smartly addresses this in Filters Against Folly: Those who take the wedge (Slippery Slope) argument with the utmost seriousness act as though they think human beings are completely devoid of practical judgment. Countless examples from everyday life show the pessimists are wrong… If we took the wedge argument seriously, we would pass a law forbidding all vehicles to travel at any speed greater than zero. That would be an easy way out of the moral problem. But we pass no such law.”
- When bad things get better, or good things get worse, it might not be causal—it might just be regression to the mean. “Whenever correlation is imperfect, extremes will soften over time. The best will always appear to get worse and the worst will appear to get better, regardless of any additional action. This is called regression to the mean, and it means we have to be extra careful when diagnosing causation.”
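A quick simulation (mine, not the book’s) makes the effect concrete. Model each observed score as persistent skill plus redrawn luck; because the correlation between trials is imperfect, the top decile on trial one scores worse on trial two with no causal change whatsoever:

```python
import random

# Observed score = persistent skill + one-off luck.
random.seed(42)
n = 10_000
skill = [random.gauss(0, 1) for _ in range(n)]
trial1 = [s + random.gauss(0, 1) for s in skill]  # luck redrawn each trial
trial2 = [s + random.gauss(0, 1) for s in skill]

# Pick the top 10% of performers on trial 1...
ranked = sorted(range(n), key=lambda i: trial1[i], reverse=True)
top = ranked[: n // 10]

# ...and compare their averages across the two trials.
avg1 = sum(trial1[i] for i in top) / len(top)
avg2 = sum(trial2[i] for i in top) / len(top)
print(f"top decile, trial 1: {avg1:.2f}")
print(f"same people, trial 2: {avg2:.2f}")  # noticeably lower
```

The “best” got worse only because their trial-one luck was not repeated, which is why before-and-after comparisons of extremes are so prone to bogus causal stories.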
- Occam’s Razor explained in math. “Why are more complicated explanations less likely to be true? Let’s work it out mathematically. Take two competing explanations, each of which seem to equally explain a given phenomenon. If one of them requires the interaction of three variables and the other the interaction of thirty variables, all of which must have occurred to arrive at the stated conclusion, which of these is more likely to be in error? If each variable has a 99% chance of being correct, the first explanation is only 3% likely to be wrong. The second, more complex explanation, is about nine times as likely to be wrong, or 26%. The simpler explanation is more robust in the face of uncertainty.”
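The arithmetic in that passage checks out, and it generalizes to any number of assumptions. A minimal sketch of the calculation:

```python
# If each independent assumption is 99% likely to be correct, the
# explanation as a whole holds with probability 0.99 ** n.
def p_wrong(n_variables, p_correct=0.99):
    return 1 - p_correct ** n_variables

simple = p_wrong(3)     # ~3% chance of error
complex_ = p_wrong(30)  # ~26% chance of error

print(f"3 variables:  {simple:.0%} chance of error")
print(f"30 variables: {complex_:.0%} chance of error")
print(f"ratio: {complex_ / simple:.1f}x")  # roughly nine times as likely to be wrong
```

The multiplicative structure is the whole argument: every extra required assumption compounds the chance that something in the chain fails.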