Thinking, Fast and Slow
Thinking, Fast and Slow is a 2011 popular science book by the Israeli-American psychologist Daniel Kahneman. Its main thesis is a differentiation between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical.
The book delineates rational and non-rational motivations or triggers associated with each type of thinking process, and how they complement each other, starting with Kahneman's own research on loss aversion. From framing choices to people's tendency to replace a difficult question with one that is easy to answer, the book summarizes several decades of research to suggest that people have too much confidence in human judgement. The book draws on Kahneman's own research, much of it conducted in collaboration with the psychologist Amos Tversky, and covers different phases of his career: his early work on cognitive biases, his work on prospect theory and happiness, and his work with the Israel Defense Forces.
Jason Zweig, a columnist at The Wall Street Journal, helped write and research the book over two years. The book was a New York Times bestseller and the 2012 winner of the National Academies Communication Award for best creative work that aids the public understanding of topics in behavioral science, engineering and medicine. The integrity of some priming studies cited in the book has been called into question amid the replication crisis in psychology.
Two systems
In the book's first section, Kahneman describes two different ways the brain forms thoughts:
System 1: Fast, automatic, frequent, emotional, stereotypic, unconscious. Examples of things System 1 can do:
* determine that an object is at a greater distance than another
* localize the source of a specific sound
* complete a common phrase
* display disgust when seeing a gruesome image
* solve basic arithmetic, such as 2 + 2
* read text on a billboard
* drive a car on an empty road
* think of a good chess move (if you are a chess master)
* understand simple sentences
System 2: Slow, effortful, infrequent, logical, calculating, conscious. Examples of things System 2 can do:
* prepare for the start of a sprint
* direct attention towards certain people in a crowded environment
* look for a person with a particular feature
* try to recognize a sound
* sustain a faster-than-normal walking rate
* determine the appropriateness of a particular behavior in a social setting
* count the number of A's or other letters in a given text
* give one's own telephone number to someone else
* park in a tight parking space
* determine the price/quality ratio of two products
* assess the validity of a complex logical argument
* multiply two-digit numbers, such as 17 × 24
Heuristics and biases
The second section offers explanations for why humans struggle to think statistically. It begins by documenting a variety of situations in which we either arrive at binary decisions or fail to associate reasonable probabilities with outcomes. Kahneman explains this phenomenon using the theory of heuristics, a topic he and Tversky originally covered in their 1974 article "Judgment under Uncertainty: Heuristics and Biases".
Kahneman uses heuristics to assert that System 1 thinking involves associating new information with existing patterns, or thoughts, rather than creating new patterns for each new experience. For example, a child who has only seen shapes with straight edges might perceive an octagon when first viewing a circle. As a legal metaphor, a judge limited to heuristic thinking would be able to think only of similar historical cases when presented with a new dispute, rather than considering the unique aspects of the case at hand. In addition to explaining the statistical problem, the theory also offers an explanation for human biases.
Anchoring
The "anchoring effect" names a tendency to be influenced by irrelevant numbers. Shown greater/lesser numbers, experimental subjects gave greater/lesser responses. As an example, most people, when asked whether Mahatma Gandhi was more than 114 years old when he died, will provide a much greater estimate of his age at death than others who were asked whether Gandhi was more or less than 35 years old. Experiments show that people's behavior is influenced, much more than they are aware, by irrelevant information.Availability
The availability heuristic is a mental shortcut that occurs when people judge the probability of events by how easy it is to think of examples. It operates on the notion that "if you can think of it, it must be important". The availability of consequences associated with an action is positively related to perceptions of the magnitude of those consequences. In other words, the easier it is to recall the consequences of something, the greater we perceive those consequences to be. Sometimes this heuristic is beneficial, but the frequencies with which events come to mind are usually not accurate reflections of their probabilities in real life.
Conjunction fallacy
System 1 is prone to substituting a simpler question for a difficult one. In what Kahneman terms his and Tversky's "best-known and most controversial" experiment, "the Linda problem", subjects were told about an imaginary Linda: young, single, outspoken, and intelligent, and, as a student, deeply concerned with discrimination and social justice. They were then asked whether it was more probable that Linda is a bank teller or that she is a bank teller and an active feminist. The overwhelming response was that "feminist bank teller" was more likely than "bank teller", violating the laws of probability. In this case System 1 substituted the easier question, "Is Linda a feminist?", neglecting the occupation qualifier. An alternative interpretation is that the subjects added an unstated cultural implicature: that the other answer implied an exclusive or, i.e., that Linda was not a feminist.
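Stated in standard probability notation (a gloss on the experiment, not Kahneman's own formalism), the majority response violates the conjunction rule of elementary probability: for any two events $A$ and $B$,

$$P(A \cap B) \le P(A).$$

With $A$ = "Linda is a bank teller" and $B$ = "Linda is an active feminist", the conjunction "feminist bank teller" can never be more probable than "bank teller" alone, no matter how representative of Linda the feminist description seems.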
Optimism and loss aversion
Kahneman writes of a "pervasive optimistic bias", which "may well be the most significant of the cognitive biases". This bias generates the illusion of control: the illusion that we have substantial control over our lives.
A natural experiment reveals the prevalence of one kind of unwarranted optimism. The planning fallacy is the tendency to overestimate benefits and underestimate costs, impelling people to begin risky projects. In 2002, American kitchen remodeling was expected on average to cost $18,658, but actually cost $38,769, more than double the estimate.
To explain overconfidence, Kahneman introduces the concept he terms "What You See Is All There Is" (WYSIATI). This theory states that when the mind makes decisions, it deals primarily with Known Knowns, phenomena it has observed already. It rarely considers Known Unknowns, phenomena that it knows to be relevant but about which it does not have information. Finally, it appears oblivious to the possibility of Unknown Unknowns, unknown phenomena of unknown relevance.
He explains that humans fail to take complexity into account, and that their understanding of the world consists of a small and necessarily unrepresentative set of observations. Furthermore, the mind generally does not account for the role of chance and therefore falsely assumes that a future event will resemble a past one.
Framing
Framing is the context in which choices are presented. In one experiment, some subjects were asked whether they would opt for surgery when told that the "survival" rate was 90 percent, while others were told that the mortality rate was 10 percent. The first framing increased acceptance, even though the two situations were identical.
Sunk cost
Rather than consider the odds that an incremental investment would produce a positive return, people tend to "throw good money after bad" and continue investing in projects with poor prospects that have already consumed significant resources. In part this is to avoid feelings of regret.
Overconfidence
This part of the book is dedicated to the undue confidence the mind has in what it believes it knows. It suggests that people often overestimate how much they understand about the world and, in particular, underestimate the role of chance. This is related to the excessive certainty of hindsight, when an event seems to have been understood only after it has occurred or developed. Kahneman's opinions concerning overconfidence are influenced by the author Nassim Nicholas Taleb.
Choices
In this section Kahneman returns to economics and expands his seminal work on prospect theory. He discusses the tendency for problems to be addressed in isolation and how, when other reference points are considered, the choice of reference point has a disproportionate effect on the outcome. This section also offers advice on how some of the shortcomings of System 1 thinking can be avoided.
Prospect theory
Kahneman developed prospect theory, the basis for his Nobel prize, to account for experimental errors he noticed in Daniel Bernoulli's traditional utility theory. According to Kahneman, utility theory makes logical assumptions of economic rationality that do not reflect people's actual choices, and does not take cognitive biases into account.
One example is that people are loss-averse: they are more likely to act to avert a loss than to achieve a gain. Another example is that the value people place on a change in probability depends on the reference point: people seem to place greater value on a change from 0% to 10% than on one from, say, 45% to 55%, and they place the greatest value of all on a change from 90% to 100%. This occurs despite the fact that under traditional utility theory all three changes give the same increase in utility. Consistent with loss aversion, the order of the first and third of these is reversed when the event is presented as losing rather than gaining something: there, the greatest value is placed on reducing the probability of a loss to zero.
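The asymmetry among those three equal ten-point changes can be made concrete with the probability-weighting function from Kahneman and Tversky's later formulation, cumulative prospect theory (1992). The sketch below is illustrative rather than the book's own presentation; the functional form and the parameter gamma = 0.61 are taken from their 1992 estimates for gains.

```python
# Illustrative sketch of probability weighting in cumulative prospect theory
# (Tversky & Kahneman, 1992). gamma = 0.61 is their estimate for gains;
# the formula does not appear in the book itself.

def weight(p: float, gamma: float = 0.61) -> float:
    """Decision weight w(p) = p^g / (p^g + (1 - p)^g)^(1/g)."""
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

# Compare the three ten-point probability changes discussed above.
for lo, hi in [(0.00, 0.10), (0.45, 0.55), (0.90, 1.00)]:
    print(f"{lo:.0%} -> {hi:.0%}: weight gain = {weight(hi) - weight(lo):.3f}")

# Approximate output:
#   0% -> 10%:   weight gain = 0.186   (possibility effect)
#   45% -> 55%:  weight gain = 0.051
#   90% -> 100%: weight gain = 0.288   (certainty effect)
```

Although each change adds the same ten percentage points, the decision weight jumps most at the boundaries, consistent with the ordering described in the paragraph above.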
After the book's publication, the Journal of Economic Literature published a discussion of the book's treatment of prospect theory, as well as an analysis of the four fundamental factors on which it is based.