Cognitive bias
A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.
While cognitive biases may initially appear to be negative, some are adaptive. They may lead to more effective actions in a given context. Furthermore, allowing cognitive biases enables faster decisions, which can be desirable when timeliness is more valuable than accuracy, as illustrated by the use of heuristics. Other cognitive biases are a "by-product" of human processing limitations, resulting from a lack of appropriate mental mechanisms, the impact of an individual's constitution and biological state, or simply from a limited capacity for information processing. Cognitive biases can make individuals more inclined to endorse pseudoscientific beliefs by requiring less evidence for claims that confirm their preconceptions. This can distort their perceptions and lead to inaccurate judgments.
A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics. The study of cognitive biases has practical implications for areas including clinical judgment, entrepreneurship, finance, and management.
Overview
When making judgments under uncertainty, people rely on mental shortcuts, or heuristics, which provide swift estimates of the likelihood of uncertain events. For example, the representativeness heuristic is the tendency to judge the frequency or likelihood of an event by the extent to which it resembles the typical case. Similarly, under the availability heuristic, individuals estimate the likelihood of events by how easily examples come to mind, and under the anchoring heuristic, people rely heavily on an initial reference point and adjust insufficiently away from it. While these heuristics are efficient and simple for the brain to compute, they sometimes introduce predictable and systematic cognitive errors, or biases.
The "Linda Problem" illustrates the representativeness heuristic and corresponding bias. Participants were given a description of "Linda" that suggests Linda might well be a feminist. They were then asked whether they thought Linda was more likely to be a "bank teller" or a "bank teller and active in the feminist movement." A majority chose answer. Independent of the information given about Linda, though, the more restrictive answer is under any circumstance statistically less likely than answer. This is an example of the conjunction fallacy: respondents chose because it seemed more "representative" or typical of persons who might fit the description of Linda. The representativeness heuristic may lead to errors such as activating stereotypes and inaccurate judgments of others.
Gerd Gigerenzer argues that such findings should not lead us to conceive of human thinking as riddled with irrational cognitive biases. Rather, rationality should be conceived as an adaptive tool that is not identical to the rules of formal logic or the probability calculus. Gigerenzer holds that many so-called biases are better understood as rules of thumb, or as he puts it "gut feelings", that can actually help us make accurate decisions in our lives. On this view, there is no clear evidence that these behaviors are genuinely, severely biased once the actual problems people face are understood. Advances in economics and cognitive neuroscience suggest that many behaviors previously labeled as biases might instead represent optimal decision-making strategies.
History
The notion of cognitive biases was introduced by Amos Tversky and Daniel Kahneman in 1972 and grew out of their experience of people's innumeracy, or inability to reason intuitively with greater orders of magnitude. Tversky, Kahneman, and colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory. Their 1974 paper, Judgment under Uncertainty: Heuristics and Biases, outlined how people rely on mental shortcuts when making judgments under uncertainty. Experiments such as the "Linda problem" grew into the heuristics and biases research program, which spread beyond academic psychology into other disciplines, including medicine and political science.

The list of cognitive biases has long been a topic of critique. In psychology, a "rationality war" unfolded between Gerd Gigerenzer and the Kahneman and Tversky school, which pivoted on whether biases are primarily defects of human cognition or the result of behavioural patterns that are actually adaptive or "ecologically rational". Gigerenzer has historically been one of the main critics of the heuristics-and-biases program. This debate has recently reignited, with critiques arguing that there has been an overemphasis on biases in human cognition.
Later work introduced the concept of cognitive bias modification, which focuses on reducing maladaptive cognitive patterns through computer-based attention training and behavioral tasks.
Types
Biases can be distinguished on a number of dimensions. Examples of cognitive biases include:
- Biases specific to groups versus biases at the individual level.
- Biases that affect decision-making, where the desirability of options has to be considered.
- Biases, such as illusory correlation, that affect judgment of how likely something is or whether one thing is the cause of another.
- Biases that affect memory, such as consistency bias.
- Biases that reflect a subject's motivation, for example, the desire for a positive self-image leading to egocentric bias and the avoidance of unpleasant cognitive dissonance.
- Biases due to ignoring relevant information.
- Biases in which a decision or judgment is affected by irrelevant information.
- Biases that give excessive weight to an unimportant but salient feature of the problem.
Some cognitive biases belong to the subgroup of attentional biases, which refers to paying increased attention to certain stimuli. It has been shown, for example, that people addicted to alcohol and other drugs pay more attention to drug-related stimuli. Common psychological tests to measure those biases are the Stroop task and the dot probe task.
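As an illustration of how an attentional bias score is typically derived from dot-probe reaction times, here is a minimal sketch; the trial data and field names are hypothetical, and real studies apply additional preprocessing such as removing error trials and outlier reaction times:

```python
# Hypothetical dot-probe trials: reaction time (ms) and whether the probe
# appeared where the salient (e.g., drug-related) stimulus had been shown,
# rather than where the neutral stimulus had been.
trials = [
    {"rt_ms": 512, "probe_at_salient": True},
    {"rt_ms": 548, "probe_at_salient": False},
    {"rt_ms": 495, "probe_at_salient": True},
    {"rt_ms": 561, "probe_at_salient": False},
]

def mean_rt(trials, at_salient):
    rts = [t["rt_ms"] for t in trials if t["probe_at_salient"] == at_salient]
    return sum(rts) / len(rts)

# A positive score means faster responses when the probe replaces the
# salient stimulus, i.e., attention was already drawn toward that stimulus.
bias_score = mean_rt(trials, at_salient=False) - mean_rt(trials, at_salient=True)
print(f"Attentional bias score: {bias_score:.1f} ms")
```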
Individuals' susceptibility to some types of cognitive biases can be measured by the Cognitive Reflection Test developed by Shane Frederick.
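The flavor of the test is captured by its best-known item: a bat and a ball cost $1.10 in total, and the bat costs $1.00 more than the ball; how much does the ball cost? The intuitive answer of 10 cents is wrong. Letting $x$ be the price of the ball in dollars:

$$x + (x + 1.00) = 1.10 \quad\Rightarrow\quad 2x = 0.10 \quad\Rightarrow\quad x = 0.05,$$

so the ball costs 5 cents; arriving at the correct answer requires suppressing the intuitive response.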
List of biases
The following is a list of the more commonly studied cognitive biases:

| Name | Description |
| --- | --- |
| Fundamental attribution error | Tendency to overemphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior. Edward E. Jones and Victor A. Harris' classic study illustrates the FAE: even when participants were told that the pro- or anti-Castro direction of a speech had been assigned to its writer, they ignored the situational pressures and attributed pro-Castro attitudes to the writer when the speech expressed such attitudes. |
| Implicit bias | Tendency to attribute positive or negative qualities to a group of individuals. It can be entirely non-factual, or it can be an unwarranted generalization of a frequent trait in a group to all individuals of that group. |
| Priming bias | Tendency to be influenced by the first presentation of an issue, which creates a preconceived idea of it that we can then adjust with later information. |
| Confirmation bias | Tendency to search for or interpret information in a way that confirms one's preconceptions, and discredit information that does not support the initial opinion. Related to the concept of cognitive dissonance, in that individuals may reduce inconsistency by searching for information which reconfirms their views. |
| Affinity bias | Tendency to be favorably biased toward people most like ourselves. |
| Self-serving bias | Tendency to claim more responsibility for successes than for failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests. |
| Belief bias | Tendency to evaluate the logical strength of an argument based on current belief and perceived plausibility of the statement's conclusion. |
| Framing | Tendency to narrow the description of a situation in order to guide the audience to a selected conclusion. The same information can be framed differently and therefore lead to different conclusions. |
| Hindsight bias | Tendency to view past events as being predictable. Also called the "I-knew-it-all-along" effect. |
| Embodied cognition | Tendency to have selectivity in perception, attention, decision making, and motivation based on the biological state of the body. |
| Anchoring bias | Tendency to rely too heavily on an initial reference point (the "anchor") and to make insufficient adjustments from it when arriving at a final answer. It can lead people to make sub-optimal decisions. Anchoring affects decision making in negotiations, medical diagnoses, and judicial sentencing. |
| Status quo bias | Tendency to hold to the current situation rather than an alternative situation, to avoid risk and loss. In status quo bias, a decision-maker has the increased propensity to choose an option because it is the default option or status quo. Has been shown to affect various important economic decisions, for example, a choice of car insurance or electrical service. |
| Overconfidence effect | Tendency to overly trust one's own capability to make correct decisions. People tend to overrate their abilities and skills as decision makers. See also the Dunning–Kruger effect. |
| Physical attractiveness stereotype | The tendency to assume people who are physically attractive also possess other desirable personality traits. |
| Halo effect | Tendency for positive impressions to contaminate other evaluations. In marketing, it may manifest as a positive bias toward a certain product based on previous positive experiences with another product from the same brand. In psychology, the halo effect explains why people often assume that individuals who are viewed as attractive are also popular, successful, and happy. |