Two psychologists unlocked important keys to both the mind and economics. Amos Tversky and Daniel Kahneman helped found the field of behavioral economics and revolutionized cognitive psychology by discovering a set of cognitive and psychological biases that affect our decision-making.
These systematic errors in our thinking and logic affect our everyday choices, behaviors, and evaluations of others. For more on this topic, please also see the Cognitive Distortions and Logical Fallacies data sets.
Heuristics: Mental shortcuts
Psychological biases are often the result of heuristics, which are mental shortcuts that help people make decisions quickly, but sometimes at the expense of accuracy.
One of the most well-known biases is confirmation bias, which is the tendency to search for, interpret, and remember information in a way that confirms one’s pre-existing beliefs or hypotheses. This can lead individuals to ignore or dismiss evidence that challenges their views.
Another common bias is the anchoring effect, where individuals rely too heavily on an initial piece of information, known as the “anchor,” when making decisions. For example, if you are told that a shirt is on sale for $50, down from $100, you might perceive it as a good deal, even if the shirt is not worth $50.
The availability heuristic is a mental shortcut that leads people to overestimate the likelihood of events that are easily recalled. For instance, if someone recently heard about a plane crash, they might overestimate the dangers of flying, even though statistically, it is much safer than driving.
The Dunning-Kruger effect is a cognitive bias where individuals with low ability at a task overestimate their ability. Essentially, they are not skilled enough to recognize their own incompetence. On the flip side, highly competent individuals may underestimate their relative competence.
The halo effect is a type of bias where the perception of one positive trait of a person or thing influences the perception of other traits. For example, if someone is physically attractive, they are often perceived as more intelligent, talented, or kind.
Loss aversion is the tendency to prefer avoiding losses over acquiring equivalent gains. People are generally more upset about losing $20 than they are happy about gaining $20. This bias can lead to risk-averse behavior.
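Loss aversion has a standard quantitative form in Kahneman and Tversky's prospect theory. Below is a minimal Python sketch of the value function, using the median parameter estimates reported in Tversky and Kahneman (1992); the $20 stake is just an illustration:

```python
# Loss aversion via the prospect-theory value function:
#   v(x) = x^alpha          for gains  (x >= 0)
#   v(x) = -lambda*(-x)^alpha for losses (x < 0)
# alpha = 0.88 and lambda = 2.25 are the median estimates from
# Tversky & Kahneman (1992); the dollar amount is illustrative.

def prospect_value(x, alpha=0.88, lam=2.25):
    """Subjective (felt) value of gaining or losing x dollars."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

gain = prospect_value(20)    # felt value of winning $20
loss = prospect_value(-20)   # felt value of losing $20

# The loss looms larger than the equivalent gain (2.25x larger here).
print(f"gain feels like +{gain:.1f}, loss feels like {loss:.1f}")
assert abs(loss) > abs(gain)
```

The asymmetry parameter (λ > 1) is what makes losing $20 sting more than winning $20 pleases.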
The bandwagon effect refers to the tendency of people to align their beliefs and behaviors with those of a group. This can be seen in various social phenomena such as fashion trends and political movements.
The hindsight bias is the inclination to see events as more predictable after they have happened. People often believe that they “knew it all along,” which can create overconfidence in their ability to predict events.
These are just a handful of the 30 psychological biases detailed in the dictionary table below. Arm yourself with awareness of these biases; thinking critically about them can help you make more rational, informed decisions.
Psychological biases dictionary
Psychological bias | Explanation | Example |
---|---|---|
action bias | Belief that, when faced with an ambiguous situation or challenge, we must take some action rather than do nothing, whether or not acting is a good idea (and often quickly, without taking the time to fully examine the problem); also known as "naive interventionism" | sports enthusiasts rooting for their favorite teams are notorious for superstitious rituals that cause them psychological anguish if they cannot perform them, despite the objective fact that the rituals have no ability whatsoever to affect the outcome (in pop culture, Robert De Niro's character in Silver Linings Playbook exemplifies this) |
adjustment heuristic | Tendency to start from an implicitly suggested reference point (the "anchor") when assessing probabilities, then make adjustments to that reference point to reach an estimate | |
affect heuristic | We tend to underestimate the role of feelings of liking & disliking in our judgments and decision-making | Instead of considering risks and benefits independently, individuals with a negative attitude towards nuclear power may rate its benefits as low and its risks as high, producing a stronger negative risk-benefit correlation than when judgments are made without time pressure (Finucane, Alhakami, Slovic, & Johnson, 2000) |
anchoring effect | Fixating on a value or number that gets compared to everything else, because we tend to compare/contrast limited sets of items (aka the “relativity trap”); store sale items take advantage of this, so we compare the sale price to the original price rather than judging the sale price on its own as a measure of worth | |
availability heuristic | Tendency to make quick "intuitive" judgments about the size of given categories by the ease with which particular instances/examples of the class come to mind | |
bandwagon effect | Similar to groupthink, arising from our built-in desire to fit in and conform, we tend to "go along with the trend" when it becomes apparent to us | |
contagion heuristic | Tendency to avoid contact with people or objects viewed as "contaminated" by previous contact with someone or something else viewed as "bad" | Related to/inclusive of magical thinking β believing a person's sweater still carries their "essence," e.g. |
confirmation bias | We tend to agree w/those who agree with us & avoid associating with those who don't, to avoid the discomfort of cognitive dissonance (the Internet has sadly made this worse) | |
conjunction fallacy | A formal fallacy that occurs when one believes a specific condition is more probable than a general one | the classic "Linda problem": judging "Linda is a bank teller and active in the feminist movement" as more probable than "Linda is a bank teller" |
current moment bias | Preference to experience pleasure now, & put off the “pain” til later; lack of ability to imagine ourselves in the future & alter today's behaviors accordingly | |
disjunction fallacy | Misjudging the disjunction of two events as less likely than one of the events individually, even though by probability theory the disjunction must be at least as likely as either event alone | |
false consensus effect | People tend to overestimate the degree to which the general public shares their beliefs and opinions | potentially related to the availability heuristic, the self-serving bias, and naive realism |
focusing illusion | Placing too much emphasis on one aspect of an event, outweighing its importance and causing error in judgment | |
gambler's fallacy | Putting a tremendous amount of weight on previous events, believing they will influence future outcomes (even when the outcome is random) | also frequently a logical fallacy |
identifiable victim effect | Tendency for people to care deeply about a single, specific tragedy but seem disinterested in vast atrocities affecting thousands or millions of people | more broadly, abstract concepts motivate us less than individual cases (especially when given visual evidence) |
ingroup bias | Overestimating abilities and values of our immediate group & underestimating that of outgroups (oxytocin plays a role) | |
naive realism | The belief that each one of us sees the world objectively, while the people who disagree with us must be either uninformed or irrational | "Everyone is influenced by ideology and self-interest. Except for me. I see things as they are." |
negativity bias | We pay more attention to bad news | |
neglecting probability | Tendency to disregard probability when making decisions under uncertainty; the reason we're afraid to fly even though we're statistically far more likely to be in a car accident (the same way we fear terrorism but not the more mundane accidents that are far more likely) | |
observational selection bias | Suddenly noticing things we didn't notice before & assuming frequency has increased (also contributes to feeling appearance of certain things or events can't be coincidence) | |
optimism bias | Tendency to believe that good things happen more often than bad things | |
planning fallacy | Systematic tendency toward unrealistic optimism about the time it takes to complete tasks | |
positive expectation bias | Sense that our luck has to change for the better | |
post-purchase rationalization | Making ourselves feel better after we make crappy decisions (aka Buyer's Stockholm Syndrome) | |
projection bias | Assumption that most people think just like us (false consensus bias is related: thinking that others agree with us) | |
resemblance bias | Tendency to ignore statistical facts and use resemblance as a simplifying heuristic to make difficult judgments | |
self-serving bias | Tendency to evaluate ambiguous or complex information in a way that is beneficial to one's own interests, as well as to claim responsibility for successes and attribute failures to others or to uncontrollable external factors | |
shifting baseline syndrome | We tend to use very recent data points in our research (even when more data is available) and thus can miss picking up on some long-term trends | |
status-quo bias | We fear change, so tend to make choices that guarantee things remain the same (& by extension, assume that any other choice will be inferior, or make things worse) | |
treadmill effect | Our desire for the new version of a product or service is acute, even if upgrades are minor & incremental; but the pleasure we get from the new object wears off quickly to leave us back at the original satisfaction baseline | |
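Two of the entries above, the conjunction fallacy and the gambler's fallacy, can be checked directly against probability theory. A minimal Python sketch: the event probabilities are made up for illustration, and the coin simulation simply shows that a streak of tails does not change the odds of the next flip.

```python
import random

random.seed(42)

# Conjunction fallacy: P(A and B) can never exceed P(A) alone.
p_a = 0.10                           # hypothetical P(specific condition)
p_b = 0.30                           # hypothetical P(another condition)
p_both_upper_bound = min(p_a, p_b)   # P(A and B) <= min(P(A), P(B))
assert p_both_upper_bound <= p_a     # the conjunction is never more probable

# Gambler's fallacy: after 3 tails in a row, heads is still 50/50.
heads_after_streak = []
tails_streak = 0
for _ in range(200_000):
    heads = random.random() < 0.5
    if tails_streak >= 3:            # this flip follows 3 straight tails
        heads_after_streak.append(heads)
    tails_streak = 0 if heads else tails_streak + 1

p_heads = sum(heads_after_streak) / len(heads_after_streak)
print(f"P(heads | 3 tails in a row) = {p_heads:.2f}")  # stays near 0.50
```

The simulated frequency hovers around 0.50 no matter how long the preceding streak, which is exactly what the gambler's fallacy denies.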
Read More:
Top Mental Models for Thinkers →
Model thinking is an excellent way of improving our cognition and decision making abilities.
28 Cognitive distortions list →
Cognitive distortions are bad mental habits and unhelpful ways of thinking that can limit one’s ability to function in the world.
24 Logical fallacies list →
Recognizing and avoiding logical fallacies is essential for critical thinking and effective communication.