Two psychologists ended up unlocking important keys to both the mind and economics. Amos Tversky and Daniel Kahneman created the field of behavioral economics and revolutionized cognitive psychology by discovering a set of cognitive and psychological biases that affect our decision-making.
These systematic errors in our thinking and logic affect our everyday choices, behaviors, and evaluations of others. For more on this topic, please also see the Cognitive Distortions and Logical Fallacies data sets.
Psychological bias | Explanation | Example |
---|---|---|
action bias | Belief that when we're faced with an ambiguous situation or challenge, we must take some action rather than do nothing, whether or not acting is a good idea (and often quickly, without taking the time to fully examine the problem); also known as "naive interventionism" | sports enthusiasts rooting for their favorite teams are notorious for superstitious rituals, and feel real psychological anguish if unable to perform them, despite the objective fact that they have no ability whatsoever to affect the outcome (in pop culture, Robert De Niro's character in Silver Linings Playbook exemplifies this) |
adjustment heuristic | Tendency to start from an implicitly suggested reference point when assessing probabilities (the "anchor") and then make adjustments to that reference point to reach an estimate | |
affect heuristic | We tend to underestimate the role of feelings of liking & disliking in our judgments and decision-making | Instead of considering risks and benefits independently, individuals with a negative attitude towards nuclear power may judge its benefits as low and its risks as high, an inverse risk-benefit relationship that grows even stronger under time pressure (Finucane, Alhakami, Slovic, & Johnson, 2000) |
anchoring effect | Fixating on a value or number that gets compared to everything else, because we tend to compare/contrast only limited sets of items (aka the "relativity trap"); store sale items take advantage of this (we compare the sale price to the original price, rather than judging the sale price on its own as a measure of worth) | |
availability heuristic | Tendency to make quick "intuitive" judgments about the size of given categories by the ease with which particular instances/examples of the class come to mind | |
bandwagon effect | Similar to groupthink, arising from our built-in desire to fit in and conform, we tend to "go along with the trend" when it becomes apparent to us | |
contagion heuristic | Tendency to avoid contact with people or objects viewed as "contaminated" by previous contact with someone or something else viewed as "bad" | Related to/inclusive of magical thinking, e.g. believing a person's sweater still carries their "essence" |
confirmation bias | We tend to agree w/those who agree with us & avoid associating with those who don't, to avoid the discomfort of cognitive dissonance (the Internet has sadly made this worse) | |
conjunction fallacy | A formal fallacy that occurs when one believes a specific condition is more probable than a general one | the classic "Linda problem": judging "Linda is a bank teller and is active in the feminist movement" as more probable than "Linda is a bank teller" alone (see the probability sketch below the table) |
current moment bias | Preference to experience pleasure now & put off the "pain" until later; lack of ability to imagine ourselves in the future & alter today's behaviors accordingly | |
disjunction fallacy | Judging the disjunction of two events as less likely than one of the events on its own, even though, by definition in probability theory, the disjunction must be at least as likely as either event (see the probability sketch below the table) | |
false consensus effect | People tend to overestimate the degree to which the general public shares their beliefs and opinions | potentially related to the availability heuristic, the self-serving bias, and naive realism |
focusing illusion | Placing too much emphasis on one aspect of an event, overweighting its importance and causing errors in judgment | |
gambler's fallacy | Putting a tremendous amount of weight on previous events, believing they will influence future outcomes even when each outcome is random (see the coin-flip sketch below the table) | also frequently a logical fallacy |
identifiable victim effect | Tendency for people to care deeply about a single, specific tragedy but seem uninterested in vast atrocities affecting thousands or millions of people | more broadly, abstract concepts motivate us less than individual cases (especially when given visual evidence) |
ingroup bias | Overestimating the abilities and values of our immediate group & underestimating those of outgroups (oxytocin plays a role) | |
naive realism | The belief that each one of us sees the world objectively, while the people who disagree with us must be either uninformed or irrational | "Everyone is influenced by ideology and self-interest. Except for me. I see things as they are." |
negativity bias | We pay more attention to bad news | |
neglecting probability | The reason we're afraid to fly even though we're statistically far more likely to be in a car accident (the same way we fear terrorism but not the more mundane accidents that are far more likely) | |
observational selection bias | Suddenly noticing things we didn't notice before & assuming their frequency has increased (also contributes to the feeling that the appearance of certain things or events can't be coincidence) | |
optimism bias | Tendency to believe that good things happen more often than bad things | |
planning fallacy | Systematic tendency toward unrealistic optimism about the time it takes to complete a task | |
positive expectation bias | Sense that our luck has to change for the better | |
post-purchase rationalization | Making ourselves feel better after we make crappy decisions (aka Buyer's Stockholm Syndrome) | |
projection bias | Assumption that most people think just like us (false consensus bias is related: thinking that others agree with us) | |
resemblance bias | Tendency to ignore statistical facts and use resemblance as a simplifying heuristic to make difficult judgments | |
self-serving bias | Tendency to evaluate ambiguous or complex information in a way that is beneficial to the speaker's interests, as well as to claim responsibility for successes and attribute failures to others or to uncontrollable external factors | |
shifting baseline syndrome | We tend to use very recent data points in our research (even when more data is available) and thus can miss picking up on some long-term trends | |
status-quo bias | We fear change, so tend to make choices that guarantee things remain the same (& by extension, assume that any other choice will be inferior, or make things worse) | |
treadmill effect | Our desire for the new version of a product or service is acute, even if the upgrades are minor & incremental; but the pleasure we get from the new object wears off quickly, leaving us back at our original satisfaction baseline | |
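The conjunction and disjunction fallacies above both violate basic rules of probability. As a quick illustration (not part of the original data set, and using made-up probabilities for two hypothetical independent events A and B), the Python sketch below runs a simple Monte Carlo check showing that "A and B" can never be more likely than either event alone, and "A or B" can never be less likely.

```python
# Illustrative sketch only: hypothetical events A and B with made-up
# probabilities, used to check the rules behind the conjunction and
# disjunction fallacies.
import random

random.seed(0)
trials = 100_000
count_a = count_b = count_and = count_or = 0

for _ in range(trials):
    a = random.random() < 0.30   # hypothetical event A, P(A) = 0.30
    b = random.random() < 0.50   # hypothetical event B, P(B) = 0.50
    count_a += a
    count_b += b
    count_and += (a and b)
    count_or += (a or b)

p_a, p_b = count_a / trials, count_b / trials
p_and, p_or = count_and / trials, count_or / trials

print(f"P(A) ~ {p_a:.3f}   P(B) ~ {p_b:.3f}")
print(f"P(A and B) ~ {p_and:.3f}   P(A or B) ~ {p_or:.3f}")

# The conjunction is never more probable than either event alone,
# and the disjunction is never less probable than either event alone.
assert p_and <= min(p_a, p_b)
assert p_or >= max(p_a, p_b)
```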
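The gambler's fallacy can be checked empirically in the same spirit. The sketch below (again an illustration rather than anything from the original source) simulates a fair coin and compares the chance of heads overall with the chance of heads immediately after three tails in a row; because flips are independent, the two come out essentially the same.

```python
# Illustrative sketch only: simulate fair coin flips to show that a streak
# of tails does not make heads any more likely on the next flip.
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

STREAK = 3  # look at flips that immediately follow three tails in a row
after_streak = [
    flips[i]
    for i in range(STREAK, len(flips))
    if not any(flips[i - STREAK:i])  # previous three flips were all tails
]

print(f"P(heads overall)       ~ {sum(flips) / len(flips):.3f}")
print(f"P(heads after 3 tails) ~ {sum(after_streak) / len(after_streak):.3f}")
# Both hover around 0.5: previous flips have no influence on the next one.
```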