Summary

Thinking, Fast and Slow concerns a few major questions: how do we make decisions? And in what ways do we make decisions poorly?

The book covers three areas of Daniel Kahneman’s research: cognitive biases, prospect theory, and happiness.

System 1 and 2

Kahneman defines two systems of the mind.

System 1: operates automatically and quickly, with little or no effort, and no sense of voluntary control

  • Examples: Detect that one object is farther than another; detect sadness in a voice; read words on billboards; understand simple sentences; drive a car on an empty road.

System 2: allocates attention to the effortful mental activities that demand it, including complex computations. Often associated with the subjective experience of agency, choice and concentration

  • Examples: Focus attention on a particular person in a crowd; exercise faster than is normal for you; monitor your behavior in a social situation; park in a narrow space; multiply 17 x 24.

System 1 automatically generates suggestions, feelings, and intuitions for System 2. If endorsed by System 2, intuitions turn into beliefs, and impulses turn into voluntary actions.

System 1 can be completely involuntary. You can’t stop your brain from completing 2 + 2 = ?, or from finding a cheesecake delicious. You can’t unsee optical illusions, even if you rationally know what’s going on.

A lazy System 2 accepts what the faulty System 1 gives it without questioning, which leads to cognitive biases. Even worse, cognitive strain taxes System 2, making it even more willing to accept System 1’s answers. Therefore, we’re more vulnerable to cognitive biases when we’re stressed.

Because System 1 operates automatically and can’t be turned off, biases are difficult to prevent. Yet it’s also not wise (or energetically possible) to constantly question System 1, and System 2 is too slow to substitute in routine decisions. We should aim for a compromise: recognize situations when we’re vulnerable to mistakes, and avoid large mistakes when the stakes are high.

Cognitive Biases and Heuristics

Despite all the complexities of life, notice that you’re rarely stumped. You rarely face situations as mentally taxing as having to solve 9382 x 7491 in your head.

Isn’t it profound how we can make decisions without realizing it? You like or dislike people before you know much about them; you feel a company will succeed or fail without really analyzing it.

When faced with a difficult question, System 1 substitutes an easier one (the heuristic question) and answers that instead. The answer is often adequate, though imperfect.

Consider the following examples of heuristics:

  • Target question: Is this company’s stock worth buying? Will the price increase or decrease?
    • Heuristic question: How much do I like this company?
  • Target question: How happy are you with your life?
    • Heuristic question: What’s my current mood?
  • Target question: How far will this political candidate get in her party?
    • Heuristic question: Does this person look like a political winner?

These heuristic questions are related to the target questions, but they are imperfect substitutes. When System 1 produces an imperfect answer, System 2 has the opportunity to reject it, but a lazy System 2 often endorses the heuristic answer without much scrutiny.

Important Biases and Heuristics

Confirmation bias: We tend to find and interpret information in a way that confirms our prior beliefs. We selectively pay attention to data that fit our prior beliefs and discard data that don’t.

“What you see is all there is”: We don’t consider the global set of alternatives or data, and we don’t notice what data are missing. Related:

  • Planning fallacy: we habitually underestimate the amount of time a project will take, because we visualize a best-case scenario and ignore the many ways things can go wrong.
  • Sunk cost fallacy: we separate life into separate accounts, instead of considering the global account. For example, if you narrowly focus on a single failed project, you feel reluctant to cut your losses, but a broader view would show that you should cut your losses and put your resources elsewhere.

Ignoring reversion to the mean: If randomness is a major factor in outcomes, high performers today will suffer and low performers will improve, for no meaningful reason. Yet pundits will create superficial causal relationships to explain these random fluctuations in success and failure, observing that high performers buckled under the spotlight, or that low performers lit a fire of motivation.
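
To make this concrete, here is a minimal simulation (not from the book; the model and numbers are illustrative) in which each outcome is a fixed "skill" plus independent per-year "luck." This year's top performers fall back toward the pack next year, with no causal story required:

```python
import random

random.seed(42)

# Illustrative model: outcome = stable skill + fresh luck each year.
n = 10_000
skill = [random.gauss(0, 1) for _ in range(n)]
year1 = [s + random.gauss(0, 1) for s in skill]
year2 = [s + random.gauss(0, 1) for s in skill]

# Take the top 10% of performers in year 1 and follow them into year 2.
ranked = sorted(range(n), key=lambda i: year1[i], reverse=True)
top = ranked[: n // 10]

avg_y1 = sum(year1[i] for i in top) / len(top)
avg_y2 = sum(year2[i] for i in top) / len(top)
print(f"Top decile, year 1: {avg_y1:.2f}")
print(f"Same people, year 2: {avg_y2:.2f}")  # noticeably lower: regression to the mean
```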

Anchoring: When shown an initial piece of information, you bias toward that information, even if it’s irrelevant to the decision at hand. For instance, in one study, when a nonprofit requested $400, the average donation was $143; when it requested $5, the average donation was $20. The first piece of information (in this case, the suggested donation) influences our decision (in this case, how much to donate), even though the suggested amount shouldn’t be relevant to deciding how much to give.

Representativeness: You tend to use your stereotypes to make decisions, even when they contradict common-sense statistics. For example, if you’re told about someone who is meek and keeps to himself, you’d guess the person is more likely to be a librarian than a construction worker, even though there are far more of the latter than the former in the country.
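
A quick Bayes-style calculation shows why the base rates should win. Every number below is an assumption made up for illustration; the book gives no figures:

```python
# Hypothetical counts and stereotype-fit rates, for illustration only.
librarians = 200_000                  # assumed number of librarians
construction_workers = 10_000_000    # assumed number of construction workers

p_meek_given_librarian = 0.40        # assumed: stereotype fits many librarians
p_meek_given_construction = 0.05     # assumed: fits few construction workers

# Expected number of "meek" people in each group:
meek_librarians = librarians * p_meek_given_librarian                 # 80,000
meek_construction = construction_workers * p_meek_given_construction  # 500,000

# Even with a stereotype that strongly favors librarians, the base rate dominates.
p_librarian_given_meek = meek_librarians / (meek_librarians + meek_construction)
print(f"P(librarian | meek) = {p_librarian_given_meek:.0%}")  # ~14%
```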

Availability bias: Vivid images and stronger emotions make items easier to recall and are overweighted. Meanwhile, important issues that do not evoke strong emotions and are not easily recalled are diminished in importance.

Narrative fallacy: We seek to explain events with coherent stories, even though many events occur due to randomness. Because the stories sound plausible to us, they give us unjustified confidence about predicting the future.

Prospect Theory

Traditional Expected Utility Theory

Traditional “expected utility theory” asserts that people are rational agents who calculate the utility of each option and make the optimal choice every time.

If you preferred apples to bananas, would you rather have a 10% chance of winning an apple, or a 10% chance of winning a banana? Clearly you’d prefer the former.

Expected utility theory explained cases like these, but it failed to explain risk aversion, where in some situations people prefer a choice with a lower expected value.

Consider: would you rather have an 80% chance of gaining $100 and a 20% chance of gaining $10, or a certain gain of $80?

The expected value of the former is greater ($82, versus $80), but most people choose the latter. This makes no sense under classic utility theory, which says you should be willing to take a positive-expected-value gamble every time.
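
The arithmetic behind that comparison, as a small sketch:

```python
def expected_value(outcomes):
    """Sum of probability-weighted payoffs."""
    return sum(p * payoff for p, payoff in outcomes)

gamble = [(0.80, 100), (0.20, 10)]  # 80% chance of $100, 20% chance of $10
sure_thing = [(1.00, 80)]           # certain $80

print(expected_value(gamble))      # 82.0
print(expected_value(sure_thing))  # 80.0 -- yet most people pick this one
```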

Furthermore, it ignores how differently we feel in the case of gains and losses. Say Anthony has $1 million and Beth has $4 million. Anthony gains $1 million and Beth loses $2 million, so they each now have $2 million. Are Anthony and Beth equally happy?

Obviously not: Beth lost, while Anthony gained. Puzzling over this led Kahneman to develop prospect theory.

Prospect Theory

The key insight from the above example is that evaluations of utility are not purely dependent on the current state. Utility depends on changes from one’s reference point. Utility is attached to changes of wealth, not states of wealth. And losses hurt more than gains.

Prospect theory can be summarized in 3 points:

1. When you evaluate a situation, you compare it to a neutral reference point.

  • Usually this refers to the status quo you currently experience. But it can also refer to an outcome you expect or feel entitled to, like an annual raise. When you don’t get something you expect, you feel crushed, even though your status quo hasn’t changed.

2. Diminishing marginal utility applies to changes in wealth (and to sensory inputs).

  • Going from $100 to $200 feels much better than going from $900 to $1,000. The more you have, the less significant the change feels.

3. Losses of a given amount trigger stronger emotions than gains of the same amount (see the sketch after this list).

  • Evolutionarily, the organisms that treated threats more urgently than opportunities tended to survive and reproduce better. We have evolved to react extremely quickly to bad news.
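
The book describes this value curve only qualitatively, but Tversky and Kahneman’s 1992 paper fits it with a specific functional form. Here is a sketch of points 2 and 3 using their published parameters (alpha = beta = 0.88, lambda = 2.25); treat the exact numbers as one empirical fit, not gospel:

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value of a change x from the reference point
    (parameters from Tversky & Kahneman, 1992)."""
    if x >= 0:
        return x ** alpha           # concave for gains
    return -lam * ((-x) ** beta)    # convex and ~2.25x steeper for losses

# Point 2, diminishing sensitivity: the same $100 step feels smaller higher up.
print(value(200) - value(100))   # ~48.4
print(value(1000) - value(900))  # ~38.8

# Point 3, loss aversion: losing $100 hurts more than winning $100 pleases.
print(value(100))    # ~57.5
print(value(-100))   # ~-129.5
```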

There are a few practical implications of prospect theory.

Possibility Effect

Consider which is more meaningful to you:

  • Going from a 0% chance of winning $1 million to 5% chance
  • Going from a 5% chance of winning $1 million to 10% chance

Most likely you felt better about the first than the second. The mere possibility of winning something (that may still be highly unlikely) is overweighted in its importance. We fantasize about small chances of big gains. We obsess about tiny chances of very bad outcomes.

Certainty Effect

Now consider how you feel about these options on the opposite end of probability:

  • In a surgical procedure, going from a 90% success rate to a 95% success rate.
  • In a surgical procedure, going from a 95% success rate to a 100% success rate.

Most likely, you felt better about the second than the first. Outcomes that are almost certain are given less weight than their probability justifies. A 95% success rate is actually fantastic! But it doesn’t feel that way, because it’s not 100%.
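
Both effects are captured by the probability-weighting function Tversky and Kahneman fit in the same 1992 paper: people act on decision weights rather than raw probabilities, overweighting small chances and underweighting near-certainties. A sketch with their fitted gain-side parameter (gamma ≈ 0.61):

```python
def weight(p, gamma=0.61):
    """Decision weight for probability p (Tversky & Kahneman, 1992)."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Possibility effect: 0% -> 5% feels bigger than 5% -> 10%.
print(weight(0.05) - weight(0.00))  # ~0.13
print(weight(0.10) - weight(0.05))  # ~0.05

# Certainty effect: 95% -> 100% feels bigger than 90% -> 95%.
print(weight(1.00) - weight(0.95))  # ~0.21
print(weight(0.95) - weight(0.90))  # ~0.08
```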

Status Quo Bias

You like what you have and don’t want to lose it, even if your past self would have been indifferent about having it. For example, if your boss announces a raise, then ten minutes later says she made a mistake and takes it back, you experience this as a dramatic loss. If you heard about this happening to someone else, however, you’d likely see the change as negligible.

Framing Effects

The context in which a decision is made makes a big difference in the emotions that are invoked and in the ultimate decision. Because losses are so much more painful than gains, logically equivalent framings of the same outcome, one as a gain and one as a loss, can feel very different.

For example, a medical procedure with a 90% chance of survival sounds more appealing than one with a 10% chance of mortality, even though they’re identical.

Happiness and the Two Selves

The focus of Kahneman’s more recent research is happiness. Happiness is a tricky concept: there is in-the-moment happiness, and there is overall well-being; there is happiness we experience, and happiness we remember.

Kahneman presents two selves:

  • The experiencing self: the person who feels pleasure and pain, moment to moment. This experienced utility would best be assessed by measuring happiness at each moment, then summing it over time. (In calculus terms, this is integrating the area under the curve.)
  • The remembering self: the person who reflects on past experiences and evaluates them as a whole.

The remembering self factors heavily in our thinking. Once a moment has passed, only the remembering self remains to think about it, and the remembering self is often the one making decisions about the future.

But the remembering self evaluates differently from the experiencing self in two critical ways:

  • Peak-end rule: The overall rating is determined by the peak intensity of the experience and by how it ends, not by the average across the whole experience.
  • Duration neglect: The duration of the experience has little effect on the memory of the event.
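
A toy comparison makes the gap visible. The pain ratings below are made up for illustration; the peak-end model here (average of the worst moment and the final moment) follows the pattern of Kahneman’s cold-water and colonoscopy studies:

```python
# Pain ratings per minute (0-10 scale), invented for illustration.
short_procedure = [2, 4, 8, 8]       # 4 minutes, ends at its worst
long_procedure = [2, 4, 8, 8, 5, 3]  # same start, plus 2 milder minutes

def experienced(pain):
    """Experiencing self: total pain, summed over time."""
    return sum(pain)

def remembered(pain):
    """Remembering self: average of the peak and the final moment
    (peak-end rule); duration is ignored."""
    return (max(pain) + pain[-1]) / 2

print(experienced(short_procedure), remembered(short_procedure))  # 22, 8.0
print(experienced(long_procedure), remembered(long_procedure))    # 30, 5.5
```

The longer procedure contains strictly more total pain, yet it leaves the better memory, which is exactly the pattern Kahneman observed: adding a milder ending made patients remember the whole procedure more favorably.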

We tend to prioritize the remembering self (such as when we choose where to vacation, or in our willingness to endure pain we will later forget) and don’t give enough weight to the experiencing self.

For example, would you take a vacation that was very enjoyable, but required that at the end you take a pill that gives you total amnesia of the event? Most would decline, suggesting that memories are a key, perhaps dominant, part of the value of vacations. The remembering self, not the experiencing self, chooses vacations!

Kahneman’s push is to weight the experiencing self more. Spend more time on things that give you moment-to-moment pleasure, and reduce moment-to-moment pain. Try to shorten your commute, a common source of experienced misery. Spend more time on actively pleasurable activities, such as socializing and exercise.

Focusing Illusion

Evaluating overall life satisfaction is a difficult System 2 question: you would have to consider all the factors in your life, weigh them accurately, and then combine them into a score.

As is typical, System 1 substitutes an easier question, such as “What is my mood right now?”, or focuses on whatever is most salient: significant events (achievements and failures) or recurrent concerns (like illness).

The key point: Nothing in life is as important as you think it is when you are thinking about it. Your mood is largely determined by what you attend to. You get pleasure/displeasure from something when you think about it.

For example, even though Midwesterners despise their weather and Californians enjoy theirs, research finds that climate makes no difference in life satisfaction. Why? When people are asked about life satisfaction, climate is just a small factor in the overall question; they’re much more concerned with their careers, their love lives, and the bills they need to pay.

When you forecast your own future happiness, you overestimate the effect a change (like a promotion) will have on you, because you overestimate how salient that thought will be in your future mind. In reality, your future self will have gotten used to the new circumstances and will have other problems to worry about.

Other Insights

How to Write a Persuasive Message

  1. Maximize legibility.
  2. Use simple language.
  3. Put ideas in memorable form, such as verse: rhyming statements are judged as more truthful.
  4. Cite sources with names that are easy to pronounce.

The key is still the quality of the idea itself, but these methods do help.