
Cultivation theory is a significant concept in media studies, particularly where it intersects with psychology and the question of how media influences viewers. Developed by George Gerbner in the 1960s, cultivation theory addresses the long-term effects of television on audiences’ perceptions of reality. This overview will discuss the origins of the theory, its key components, the psychological mechanisms it suggests, and how it applies to modern media landscapes.

Origins and development

Cultivation theory emerged from broader concerns about the effects of television on viewers over long periods. To study those effects, George Gerbner, along with his colleagues at the Annenberg School for Communication at the University of Pennsylvania, initiated the Cultural Indicators Project in the mid-1960s.

This large-scale research project aimed to study how television content affected viewers’ perceptions of reality. Gerbner’s research focused particularly on the cumulative and overarching impact of television as a medium rather than the effects of specific programs.

Core components of cultivation theory

The central hypothesis of cultivation theory is that those who spend more time watching television are more likely to perceive the real world in ways that reflect the most common and recurrent messages of the television world, compared to those who watch less television. This effect is termed ‘cultivation.’ The theory comprises three analytical components:

1. Message System Analysis: This involves the study of content on television to understand the recurring and dominant messages and images presented.

2. Cultivation Analysis: This refers to research that examines the long-term effects of television. The focus is on the viewers’ conceptions of reality and whether these conceptions correlate with the world portrayed on television.

3. Mainstreaming and Resonance: Mainstreaming is the homogenization of viewers’ perceptions as television’s ubiquitous narratives become the dominant source of information and reality. Resonance occurs when viewers’ real-life experiences confirm the mediated reality, intensifying the cultivation effect.

Psychological mechanisms

Cultivation theory suggests several psychological processes that explain how media exposure shapes perceptions:

  • Heuristic Processing: Heavy television viewing encourages heuristic processing, in which viewers rely on mental shortcuts, quickly assessing reality against the images and themes most frequently presented in media.
  • Social Desirability: Television often portrays certain behaviors and lifestyles as more desirable or acceptable, which can influence viewers to adopt these standards as their own.
  • The Mean World Syndrome: A significant finding from cultivation research is that heavy viewers of television tend to believe that the world is a more dangerous place than it actually is, a phenomenon known as the “mean world syndrome.” This is particularly pronounced in genres rich in violence, like crime dramas and news.

Critiques and modern perspectives

Cultivation theory has faced various critiques and adaptations over the years. Critics argue that the theory underestimates viewer agency and the role of individual differences in media consumption. It is also said to lack specificity regarding how different genres of television might affect viewers differently.

Furthermore, with the advent of digital media, the theory’s focus on television as the sole medium of significant influence has been called into question. Modern adaptations of cultivation theory have begun to consider the effects of internet usage, social media, and platform-based viewing, which also offer repetitive and pervasive content capable of shaping perceptions.

Application to modern media

Today, cultivation theory remains relevant because it can be applied to the broader media landscape, including online platforms where algorithms determine which content viewers repeatedly receive. For example, the way social media can affect users’ perceptions of body image, social norms, or even political ideologies often mirrors the longstanding concepts of cultivation theory.

In conclusion, cultivation theory provides a critical framework for understanding the psychological impacts of media on public perceptions and individual worldviews. While originally developed in the context of television, its core principles are increasingly applicable to various forms of media, offering valuable insights into the complex interplay between media content, psychological processes, and the cultivation of perception in the digital age.


The concept of a “confirmation loop” in psychology is a critical element to understand in the contexts of media literacy, disinformation, and political ideologies. It operates on the basic human tendency to seek out, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses, known as confirmation bias. This bias is a type of cognitive bias and a systematic error of inductive reasoning that affects the decisions and judgments that people make.

Understanding the confirmation loop

A confirmation loop occurs when confirmation bias is reinforced in a cyclical manner, often exacerbated by the selective exposure to information that aligns with one’s existing beliefs. In the digital age, this is particularly prevalent due to the echo chambers created by online social networks and personalized content algorithms.

These technologies tend to present us with information that aligns with our existing views, thus creating a loop where our beliefs are constantly confirmed, and alternative viewpoints are rarely encountered. This can solidify and deepen convictions, making individuals more susceptible to disinformation and conspiracy theories, and less tolerant of opposing viewpoints.

Media literacy and disinformation

Media literacy is the ability to identify different types of media and understand the messages they’re sending. It’s crucial in breaking the confirmation loop as it involves critically evaluating sources of information, their purposes, and their impacts on our thoughts and beliefs.

With the rise of digital media, individuals are bombarded with an overwhelming amount of information, making it challenging to distinguish between credible information and disinformation. It is essential to build your own set of credible sources and to verify the ethics and integrity of any new source you come across.

Disinformation, or false information deliberately spread to deceive people, thrives in an environment where confirmation loops are strong. Individuals trapped in confirmation loops are more likely to accept information that aligns with their preexisting beliefs without scrutinizing its credibility. This makes disinformation a powerful tool in manipulating public opinion, especially in politically charged environments.

Political ideologies

The impact of confirmation loops on political ideologies cannot be overstated. Political beliefs are deeply held and can significantly influence how information is perceived and processed.

When individuals only consume media that aligns with their political beliefs, they’re in a confirmation loop that can reinforce partisan views and deepen divides. This is particularly concerning in democratic societies where informed and diverse opinions are essential for healthy political discourse.

Operation of the confirmation loop

The operation of the confirmation loop can be seen in various everyday situations. For instance, a person might exclusively watch news channels that reflect their political leanings, follow like-minded individuals on social media, and participate in online forums that share their viewpoints.

Algorithms on many platforms like Facebook and Twitter (X) detect these preferences and continue to feed similar content, thus reinforcing the loop. Over time, this can result in a narrowed perspective, where alternative viewpoints are not just ignored but may also be actively discredited or mocked.
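The feedback dynamic described above can be illustrated with a toy simulation. The sketch below is purely hypothetical (a minimal model, not any real platform’s recommendation system): an “algorithm” samples content from a window around its estimate of the user’s leaning, the user clicks on agreeable items, and each round of engagement narrows the window further.

```python
import random

random.seed(1)

def run_feed(rounds=30, feed_size=20):
    """Toy confirmation-loop model (an illustrative assumption, not a real platform).

    Item and user 'leanings' live on a -1..1 scale. The feed samples items
    from a window around the user's lean; the user clicks items that agree
    with them; every round with clicks tightens the window.
    """
    user_lean = 0.4   # a mildly partisan user
    window = 1.0      # how wide a range of viewpoints the feed draws from
    opposing_share = []
    for _ in range(rounds):
        feed = [user_lean + random.uniform(-window, window) for _ in range(feed_size)]
        clicks = [x for x in feed if abs(x - user_lean) < 0.3]  # agreeable items
        if clicks:
            window = max(0.1, window * 0.9)  # personalization tightens the feed
        # track what fraction of the feed sat on the opposite side of neutral
        opposing_share.append(sum(1 for x in feed if x * user_lean < 0) / feed_size)
    return opposing_share

share = run_feed()
print(f"opposing-view share: round 1 = {share[0]:.0%}, round 30 = {share[-1]:.0%}")
```

Because the sampling window only ever tightens, the share of opposing-view items the user encounters falls toward zero over the run: a confirmation loop in miniature.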

Becoming more aware and breaking the loop

Becoming more aware of confirmation loops and working to break them is essential for fostering open-mindedness and reducing susceptibility to disinformation. Here are several strategies to achieve this:

  1. Diversify Information Sources: Actively seek out and engage with credible sources of information that offer differing viewpoints. This can help broaden your perspective and challenge your preconceived notions.
  2. Critical Thinking: Develop critical thinking skills to analyze and question the information you encounter. Look for evidence, check sources, and consider the purpose and potential biases behind the information.
  3. Media Literacy Education: Invest time in learning about media literacy. Understanding how media is created, its various forms, and its impact can help you navigate information more effectively.
  4. Reflect on Biases: Regularly reflect on your own biases and consider how they might be affecting your interpretation of information. Self-awareness is a crucial step in mitigating the impact of confirmation loops.
  5. Engage in Constructive Dialogue: Engage in respectful and constructive dialogues with individuals who hold different viewpoints. This can expose you to new perspectives and reduce the polarization exacerbated by confirmation loops.

The confirmation loop is a powerful psychological phenomenon that plays a significant role in shaping our beliefs and perceptions, especially in the context of media literacy, disinformation, and political ideologies. By understanding how it operates and actively working to mitigate its effects, individuals can become more informed, open-minded, and resilient against disinformation.

The path toward breaking the confirmation loop involves a conscious effort to engage with diverse information sources, practice critical thinking, and foster an environment of open and respectful discourse.


Fact-checking is a critical process used in journalism to verify the factual accuracy of information before it’s published or broadcast. This practice is key to maintaining the credibility and ethical standards of journalism and media as reliable information sources. It involves checking statements, claims, and data in various media forms for accuracy and context.

Ethical standards in fact-checking

The ethical backbone of fact-checking lies in journalistic integrity, emphasizing accuracy, fairness, and impartiality. Accuracy ensures information is cross-checked with credible sources. Fairness mandates balanced presentation, and impartiality requires fact-checkers to remain as unbiased in their evaluations as humanly possible.

To evaluate a media source’s credibility, look for a masthead, mission statement, about page, or ethics statement that explains the publication’s approach to journalism. Without a stated commitment to journalistic ethics and standards, it’s entirely possible the website or outlet is publishing opinion and/or unverified claims.

Fact-checking in the U.S.: A historical perspective

Fact-checking in the U.S. has evolved alongside journalism. The rise of investigative journalism in the early 20th century highlighted the need for thorough research and factual accuracy. However, recent developments in digital and social media have introduced significant challenges.

Challenges from disinformation and propaganda

The digital era has seen an explosion of disinformation and propaganda, particularly on social media. ‘Fake news’, a term now synonymous with fabricated or distorted stories, poses a significant hurdle for fact-checkers. The difficulty lies not only in the volume of information but also in the sophisticated methods used to spread falsehoods, such as deepfakes and doctored media.

Bias and trust issues in fact-checking

The subjectivity of fact-checkers has been scrutinized, with some suggesting that personal or organizational biases might influence their work. This perception has led to a trust deficit in certain circles, where fact-checking itself is viewed as potentially politically or ideologically motivated.

Despite challenges, fact-checking remains crucial for journalism. Future efforts may involve leveraging technology like AI for assistance, though human judgment is still essential. The ongoing battle against disinformation will require innovation, collaboration with tech platforms, transparency in the fact-checking process, and public education in media literacy.

Fact-checking stands as a vital element of journalistic integrity and a bulwark against disinformation and propaganda. In the U.S., and globally, the commitment to factual accuracy is fundamental for a functioning democracy and an informed society. Upholding these standards helps protect the credibility of the media and trusted authorities, and supports the fundamental role of journalism in maintaining an informed public and a healthy democracy.


The concept of cherry-picking refers to the practice of selectively choosing data or facts that support one’s argument while ignoring those that may contradict it. This method is widely recognized not just as a logical fallacy but also as a technique commonly employed in the dissemination of disinformation. Cherry-picking can significantly impact the way information is understood and can influence political ideology, public opinion, and policy making.

Cherry-picking and disinformation

Disinformation, broadly defined, is false or misleading information that is spread deliberately, often to deceive or mislead the public. Cherry-picking plays a crucial role in the creation and propagation of disinformation.

By focusing only on certain pieces of evidence while excluding others, individuals or entities can create a skewed or entirely false narrative. This manipulation of facts is particularly effective because the information presented can be entirely true in isolation, making the deceit harder to detect. In the realm of disinformation, cherry-picking is a tool to shape perceptions, create false equivalencies, and undermine credible sources of information.

The role of cherry-picking in political ideology

Political ideologies are comprehensive sets of ethical ideals, principles, doctrines, myths, or symbols of a social movement, institution, class, or large group that explain how society should work. Cherry-picking can significantly influence political ideologies by providing a biased view of facts that aligns with specific beliefs or policies.

This biased information can reinforce existing beliefs, creating echo chambers where individuals are exposed only to viewpoints similar to their own. The practice can deepen political divisions, making it more challenging for individuals with differing viewpoints to find common ground or engage in constructive dialogue.

Counteracting cherry-picking

Identifying and countering cherry-picking requires a critical approach to information consumption and sharing. Here are several strategies:

  1. Diversify Information Sources: One of the most effective ways to recognize cherry-picking is by consuming information from a wide range of sources. This diversity of trustworthy sources helps in comparing different viewpoints and identifying when certain facts are being omitted or overly emphasized.
  2. Fact-Checking and Research: Before accepting or sharing information, it’s essential to verify the facts. Use reputable fact-checking organizations and consult multiple sources to get a fuller picture of the issue at hand.
  3. Critical Thinking: Develop the habit of critically assessing the information you come across. Ask yourself whether the evidence supports the conclusion, what might be missing, and whether the sources are credible.
  4. Educate About Logical Fallacies: Understanding and educating others about logical fallacies, like cherry-picking, can help people recognize when they’re being manipulated. This knowledge can foster healthier public discourse and empower individuals to demand more from their information sources.
  5. Promote Media Literacy: Advocating for media literacy education can equip people with the skills needed to critically evaluate information sources, understand media messages, and recognize bias and manipulation, including cherry-picking.
  6. Encourage Open Dialogue: Encouraging open, respectful dialogue between individuals with differing viewpoints can help combat the effects of cherry-picking. By engaging in conversations that consider multiple perspectives, individuals can bridge the gap between divergent ideologies and find common ground.
  7. Support Transparent Reporting: Advocating for and supporting media outlets that prioritize transparency, accountability, and comprehensive reporting can help reduce the impact of cherry-picking. Encourage media consumers to support organizations that make their sources and methodologies clear.

Cherry-picking is a powerful tool in the dissemination of disinformation and in shaping political ideologies. Its ability to subtly manipulate perceptions makes it a significant challenge to open, informed public discourse.

By promoting critical thinking, media literacy, and the consumption of a diverse range of information, individuals can become more adept at identifying and countering cherry-picked information. The fight against disinformation and the promotion of a well-informed public require vigilance, education, and a commitment to truth and transparency.


“Source amnesia” is a psychological phenomenon that occurs when an individual can remember information but cannot recall where the information came from. In the context of media and disinformation, source amnesia plays a crucial role in how misinformation spreads and becomes entrenched in people’s beliefs. This overview will delve into the nature of source amnesia, its implications for media consumption, and strategies for addressing it.

Understanding source amnesia

Source amnesia is part of the broader category of memory errors where the content of a memory is dissociated from its source. This dissociation can lead to a situation where individuals accept information as true without remembering or critically evaluating where they learned it. The human brain tends to remember facts or narratives more readily than it does the context or source of those facts, especially if the information aligns with pre-existing beliefs or emotions. This bias can lead to the uncritical acceptance of misinformation if the original source was unreliable but the content is memorable.

Source amnesia in the media landscape

The role of source amnesia in media consumption has become increasingly significant in the digital age. The vast amount of information available online and the speed at which it spreads mean that individuals are often exposed to news, facts, and narratives from myriad sources, many of which might be dubious or outright false. Social media platforms, in particular, exacerbate this problem by presenting information in a context where source credibility is often obscured or secondary to engagement.

Disinformation campaigns deliberately exploit source amnesia. They spread misleading or false information, knowing that once the information is detached from its dubious origins, it is more likely to be believed and shared. This effect is amplified by confirmation bias, where individuals are more likely to remember and agree with information that confirms their pre-existing beliefs, regardless of the source’s credibility.

Implications of source amnesia

The implications of source amnesia in the context of media and disinformation are profound. It can lead to the widespread acceptance of false narratives, undermining public discourse and trust in legitimate information sources. Elections, public health initiatives, and social cohesion can be adversely affected when disinformation is accepted as truth due to source amnesia.

The phenomenon also poses challenges for fact-checkers and educators, as debunking misinformation requires not just presenting the facts but also overcoming the emotional resonance and simplicity of the original, misleading narratives.

Addressing source amnesia

Combating source amnesia and its implications for disinformation requires a multi-pronged approach, focusing on education, media literacy, and critical thinking. Here are some strategies:

  1. Media Literacy Education: Teaching people to critically evaluate sources and the context of the information they consume can help mitigate source amnesia. This includes understanding the bias and reliability of different media outlets, recognizing the hallmarks of credible journalism, and checking multiple sources before accepting information as true.
  2. Critical Thinking Skills: Encouraging critical thinking can help individuals question the information they encounter, making them less likely to accept it uncritically. This involves skepticism about information that aligns too neatly with pre-existing beliefs or seems designed to elicit an emotional response.
  3. Source Citing: Encouraging the practice of citing sources in media reports and social media posts can help readers trace the origin of information. This practice can aid in evaluating the credibility of the information and combat the spread of disinformation.
  4. Digital Platforms’ Responsibility: Social media platforms and search engines play a crucial role in addressing source amnesia by improving algorithms to prioritize reliable sources and by providing clear indicators of source credibility. These platforms can also implement features that encourage users to evaluate the source before sharing information.
  5. Public Awareness Campaigns: Governments and NGOs can run public awareness campaigns highlighting the importance of source evaluation. These campaigns can include guidelines for identifying credible sources and the risks of spreading unverified information.

Source amnesia is a significant challenge in the fight against disinformation, making it easy for false narratives to spread unchecked. By understanding this phenomenon and implementing strategies to address it, society can better safeguard against the corrosive effects of misinformation.

It requires a concerted effort from individuals, educators, media outlets, and digital platforms to ensure that the public remains informed and critical in their consumption of information. This collective action can foster a more informed public, resilient against the pitfalls of source amnesia and the spread of disinformation.


The backfire effect is a cognitive phenomenon that occurs when individuals are presented with information that contradicts their existing beliefs, leading them not only to reject the challenging information but also to further entrench themselves in their original beliefs.

This effect is counterintuitive, as one might expect that presenting factual information would correct misconceptions. However, due to various psychological mechanisms, the opposite can occur, complicating efforts to counter misinformation, disinformation, and the spread of conspiracy theories.

Origin and mechanism

The term “backfire effect” was popularized by researchers Brendan Nyhan and Jason Reifler, whose 2010 studies suggested that corrections to false political information could actually deepen an individual’s commitment to the initial misconception. (Later large-scale replications, notably by Thomas Wood and Ethan Porter, found the effect to be rarer than first reported, though corrections still frequently fail to change minds.) This effect is thought to stem from a combination of cognitive dissonance (the discomfort experienced when holding two conflicting beliefs) and identity-protective cognition (wherein individuals process information in a way that protects their sense of identity and group belonging).

Relation to media, disinformation, echo chambers, and media bubbles

In the context of media and disinformation, the backfire effect is particularly relevant. The proliferation of digital media platforms has made it easier than ever for individuals to encounter information that contradicts their beliefs. Paradoxically, it has also made it easier for them to insulate themselves in echo chambers and media bubbles: environments where their existing beliefs are constantly reinforced and rarely challenged.

Echo chambers refer to situations where individuals are exposed only to opinions and information that reinforce their existing beliefs, limiting their exposure to diverse perspectives. Media bubbles are similar, often facilitated by algorithms on social media platforms that curate content to match users’ interests and past behaviors, inadvertently reinforcing their existing beliefs and psychological biases.

Disinformation campaigns can exploit these dynamics by deliberately spreading misleading or false information, knowing that it is likely to be uncritically accepted and amplified within certain echo chambers or media bubbles. This can exacerbate the backfire effect, as attempts to correct the misinformation can lead to individuals further entrenching themselves in the false beliefs, especially if those beliefs are tied to their identity or worldview.

How the backfire effect happens

The backfire effect happens through a few key psychological processes:

  1. Cognitive Dissonance: When confronted with evidence that contradicts their beliefs, individuals experience discomfort. To alleviate this discomfort, they often reject the new information in favor of their pre-existing beliefs.
  2. Confirmation Bias: Individuals tend to favor information that confirms their existing beliefs and disregard information that contradicts them. This tendency towards bias can lead them to misinterpret or dismiss corrective information.
  3. Identity Defense: For many, beliefs are tied to their identity and social groups. Challenging these beliefs can feel like a personal attack, leading individuals to double down on their beliefs as a form of identity defense.

Prevention and mitigation

Preventing the backfire effect and its impact on public discourse and belief systems requires a multifaceted approach:

  1. Promote Media Literacy: Educating the public on how to critically evaluate sources and understand the mechanisms behind the spread of misinformation can empower individuals to think critically and assess the information they encounter.
  2. Encourage Exposure to Diverse Viewpoints: Breaking out of media bubbles and echo chambers by intentionally seeking out and engaging with a variety of perspectives can reduce the likelihood of the backfire effect by making conflicting information less threatening and more normal.
  3. Emphasize Shared Values: Framing challenging information in the context of shared values or goals can make it less threatening to an individual’s identity, reducing the defensive reaction.
  4. Use Fact-Checking and Corrections Carefully: Presenting corrections in a way that is non-confrontational and, when possible, aligns with the individual’s worldview or values can make the correction more acceptable. Visual aids and narratives that resonate with the individual’s experiences or beliefs can also be more effective than plain factual corrections.
  5. Foster Open Dialogue: Encouraging open, respectful conversations about contentious issues can help to humanize opposing viewpoints and reduce the instinctive defensive reactions to conflicting information.

The backfire effect presents a significant challenge in the fight against misinformation and disinformation, particularly in the context of digital media. Understanding the psychological underpinnings of this effect is crucial for developing strategies to promote a more informed and less polarized public discourse. By fostering critical thinking, encouraging exposure to diverse viewpoints, and promoting respectful dialogue, it may be possible to mitigate the impact of the backfire effect and create a healthier information ecosystem.


The “wallpaper effect” is a phenomenon in media, propaganda, and disinformation where individuals become influenced or even indoctrinated by being continuously exposed to a particular set of ideas, perspectives, or ideologies. This effect is akin to wallpaper in a room, which, though initially noticeable, becomes part of the unnoticed background over time.

The wallpaper effect plays a significant role in shaping public opinion and individual beliefs, often without the conscious awareness of the individuals affected.

Origins and mechanisms

The term “wallpaper effect” stems from the idea that constant exposure to a specific type of media or messaging can subconsciously influence an individual’s perception and beliefs, similar to how wallpaper in a room becomes a subtle but constant presence. This effect is potentiated by the human tendency to seek information that aligns with existing beliefs, known as confirmation bias. It leads to a situation where diverse viewpoints are overlooked, and a singular perspective dominates an individual’s information landscape.


Media and information bubbles

In the context of media, the wallpaper effect is exacerbated by the formation of information bubbles or echo chambers. These are environments where a person is exposed only to opinions and information that reinforce their existing beliefs.

The rise of digital media and personalized content algorithms has intensified this effect, as users often receive news and information tailored to their preferences, further entrenching their existing viewpoints. More insidiously, social media platforms tend to earn higher profits when they fill users’ feeds with ideological perspectives those users already agree with. More profitable still is tilting users toward more extreme versions of those beliefs — a practice that in other contexts we call “radicalization.”

Role in propaganda and disinformation

The wallpaper effect is a critical tool in propaganda and disinformation campaigns. By consistently presenting a specific narrative or viewpoint, these campaigns can subtly alter the perceptions and beliefs of the target audience. Over time, the repeated exposure to these biased or false narratives becomes a backdrop to the individual’s understanding of events, issues, or groups, often leading to misconceptions or unwarranted biases.

Psychological impact

The psychological impact of the wallpaper effect is profound. It can lead to a narrowing of perspective, where individuals become less open to new information or alternative viewpoints. This effect can foster polarized communities and hyperpartisan politics, where dialogue and understanding between differing viewpoints become increasingly difficult.

Case studies and examples

Historically, authoritarian regimes have used the wallpaper effect to control public opinion and suppress dissent. By monopolizing the media landscape and continuously broadcasting their propaganda, these regimes effectively shaped the public’s perception of reality.

In contemporary times, this effect is also seen in democracies, where partisan news outlets or social media algorithms create a similar, though more fragmented, landscape of information bubbles.

Counteracting the wallpaper effect

Counteracting the wallpaper effect involves a multifaceted approach. Media literacy education is crucial, as it empowers individuals to critically analyze and understand the sources and content of information they consume.

Encouraging exposure to a wide range of viewpoints and promoting critical thinking skills are also essential strategies. Additionally, reforms in digital media algorithms to promote diverse viewpoints and reduce the creation of echo chambers can help mitigate this effect.

Implications for democracy and society

The wallpaper effect has significant implications for democracy and society. It can lead to a polarized public, where consensus and compromise become challenging to achieve. The narrowing of perspective and entrenchment of beliefs can undermine democratic discourse, leading to increased societal divisions and decreased trust in media and institutions.

The wallpaper effect is a critical phenomenon that shapes public opinion and belief systems. Its influence is subtle yet profound, as constant exposure to a specific set of ideas can subconsciously mold an individual’s worldview. Understanding and addressing this effect is essential in promoting a healthy, informed, and open society. Efforts to enhance media literacy, promote diverse viewpoints, and reform digital media practices are key to mitigating the wallpaper effect and fostering a more informed and less polarized public.


The “repetition effect” is a potent psychological phenomenon and a common propaganda device. This technique operates on the principle that repeated exposure to a specific message or idea increases the likelihood of its acceptance as truth or normalcy by an individual or the public. Its effectiveness lies in its simplicity and its exploitation of a basic human cognitive bias: the more we hear something, the more likely we are to believe it.

Repetition effect, by Midjourney

Historical context

The repetition effect has been used throughout history, but its most notorious use was by Adolf Hitler and the Nazi Party in Germany. Hitler, along with his Propaganda Minister, Joseph Goebbels, effectively employed this technique to disseminate Nazi ideology and promote antisemitism. In his autobiography “Mein Kampf,” Hitler wrote about the importance of repetition in reinforcing the message and ensuring that it reached the widest possible audience. He believed that the constant repetition of a lie would eventually be accepted as truth.

Goebbels echoed this sentiment in a remark widely attributed to him: “If you tell a lie big enough and keep repeating it, people will eventually come to believe it.” The Nazi regime used this strategy in various forms, including in speeches, posters, films, and through controlled media. The relentless repetition of antisemitic propaganda, the glorification of the Aryan race, and the demonization of enemies played a crucial role in the establishment and maintenance of the Nazi regime.

Psychological basis

The effectiveness of the repetition effect is rooted in cognitive psychology, in a bias known as the “illusory truth effect”: repeated exposure to a statement increases its perceived truthfulness. The phenomenon is tied to the ease with which familiar information is processed. When we hear something repeatedly, it becomes more fluent to process, and our brains misinterpret this fluency as a signal of truth.

Modern era usage

The transition into the modern era saw the repetition effect adapting to new media and communication technologies. In the age of television and radio, political figures and advertisers used repetition to embed messages in the public consciousness. The rise of the internet and social media has further amplified the impact of this technique. In the digital age, the speed and reach of information are unprecedented, making it easier for false information to be spread and for the repetition effect to be exploited on a global scale.

The repetition effect on screens and social media, by Midjourney

Political campaigns, especially in polarized environments, often use the repetition effect to reinforce their messages. The constant repetition of slogans, talking points, and specific narratives across various platforms solidifies these messages in the public’s mind, regardless of their factual accuracy.

Ethical considerations and countermeasures

The ethical implications of using the repetition effect are significant, especially when it involves spreading disinformation or harmful ideologies. It raises concerns about the manipulation of public opinion and the undermining of democratic processes.

To counteract the repetition effect, media literacy and critical thinking are essential. Educating the public about this psychological bias and encouraging skepticism towards repeated messages can help mitigate its influence. Fact-checking and the promotion of diverse sources of information also play a critical role in combating the spread of falsehoods reinforced by repetition.

Repetition effect: A key tool of propaganda

The repetition effect is a powerful psychological tool in the arsenal of propagandists and communicators. From its historical use by Hitler and the fascists to its continued relevance in the digital era, this technique demonstrates the profound impact of repeated messaging on public perception and belief.

While it can be used for benign purposes, such as in advertising or reinforcing positive social behaviors, its potential for manipulation and spreading misinformation cannot be overstated. Understanding and recognizing the repetition effect is crucial in developing a more discerning and informed approach to the information we encounter daily.


Sometimes our minds play tricks on us. They can convince us that untrue things are true, or vice versa.

Cognitive distortions are bad mental habits. They’re patterns of thinking that tend to be negatively slanted, inaccurate, and often repetitive.

These unhelpful ways of thinking can limit one’s ability to function and excel in the world. Cognitive distortions are linked to anxiety, depression, addiction, and eating disorders. They reinforce negative thinking loops, which tend to compound and worsen over time.

Irrational thinking

Cognitive distortions are systematic patterns of thought that can lead to inaccurate or irrational conclusions. These distortions often serve as mental traps, skewing our perception of reality and affecting our emotional well-being. Let’s delve into three common types: emotional reasoning, counterfactual thinking, and catastrophizing.

Mental traps, by Midjourney
  1. Emotional Reasoning: This distortion involves using one’s emotions as a barometer for truth. For example, if you feel anxious, you might conclude that something bad is going to happen, even if there’s no objective evidence to support that belief. Emotional reasoning can create a self-perpetuating cycle: your emotions validate your distorted thoughts, which in turn intensify your emotions.
  2. Counterfactual Thinking: This involves imagining alternative scenarios that could have occurred but didn’t. While this can be useful for problem-solving and learning, it becomes a cognitive distortion when it leads to excessive rumination and regret. For instance, thinking “If only I had done X, then Y wouldn’t have happened” can make you stuck in a loop of what-ifs, preventing you from moving forward.
  3. Catastrophizing: This is the tendency to imagine the worst possible outcome in any given situation. It’s like always expecting a minor stumble to turn into a catastrophic fall. This distortion can lead to heightened stress and anxiety, as you’re constantly bracing for disaster.

More cognitive distortions

| Cognitive distortion | Explanation | Example |
| --- | --- | --- |
| all-or-nothing thinking | viewing everything in absolute and extremely polarized terms | "nothing good ever happens" or "I'm always behind" |
| blaming | focusing on other people as the source of your negative feelings, and refusing to take responsibility for changing yourself; or conversely, blaming yourself harshly for things that were out of your control | |
| catastrophizing | belief that disaster will strike no matter what, and that what will happen will be too awful to bear | "What if tragedy strikes?" "What if it happens to me?" |
| counterfactual thinking | a kind of mental bargaining or longing to live in the alternate timeline where one had made a different decision | "If only I could have done it differently..." |
| dichotomous thinking | viewing events or people in all-or-nothing terms | |
| discounting positives | claiming that positive things you or others do are trivial, or ignoring good things that have happened to you | |
| emotional reasoning | letting feelings guide interpretation of reality; a way of judging yourself or your circumstances based on your emotions | "If I feel that way, it must be true" |
| filtering | mentally "filtering out" the positive aspects of a situation while magnifying the negative aspects | |
| fortune-telling | predicting the future negatively | |
| framing effects | tendency for decisions to be shaped by inconsequential features of choice problems | |
| halo effect | belief that one's success in a domain automagically qualifies them to have skills and expertise in other areas | |
| illusory correlation | tendency to perceive a relationship between two variables when no relation exists | https://en.m.wikipedia.org/wiki/Illusory_correlation |
| inability to disconfirm | rejecting any evidence or arguments that might contradict negative thoughts | |
| intuitive heuristics | tendency, when faced with a difficult question, to answer an easier question instead, typically without noticing the substitution | |
| just-world hypothesis | belief that good things tend to happen to good people, while bad things tend to happen to bad people | |
| labeling | assigning global negative traits to self and others; making a judgment about yourself or someone else as a person, versus seeing the behavior as something they did that doesn't define them as an individual | |
| ludic fallacy | in assessing the potential amount of risk in a system or decision, mistaking the real randomness of life for the well-defined risk of casinos | |
| magical thinking | a way of imagining you can wish reality into existence through the sheer force of your mind; part of a child developmental phase that not everyone grows out of | http://doctorparadox.net/essays/magical-thinking/ |
| magnification | exaggerating the importance of flaws and problems while minimizing the impact of desirable qualities and achievements | |
| mind reading | assuming what someone is thinking without sufficient evidence; jumping to conclusions | |
| negative filtering | focusing exclusively on negatives and ignoring positives | |
| nominal realism | child development phase in which names of objects aren't just symbols but intrinsic parts of the objects; sometimes called word realism, and related to magical thinking | |
| overgeneralizing | making a rule or predicting globally negative patterns on the basis of a single incident | |
| projection | attributing qualities to external actors or forces that one feels within and either a) wishes to promote and have echoed back to oneself, or b) wishes to eradicate or squelch from oneself by believing that the quality exists elsewhere, in others, but not in oneself | |
| provincialism | the tendency to see things only from the point of view of those in charge of our immediate in-groups | |
| shoulds | a list of ironclad rules one lives and punishes oneself by | "I should exercise more" "I should eat better" |
| teleological fallacy | illusion that you know exactly where you're going, knew exactly where you were going in the past, and that others have succeeded in the past by knowing where they were going | academia especially is rife with this one |
| what if? | asking a series of questions about prospective events and being unsatisfied with any answers | |

Read More:

30 Common psychological biases ↗

These systematic errors in our thinking and logic affect our everyday choices, behaviors, and evaluations of others.

Top Mental Models for Thinkers ↗

Model thinking is an excellent way of improving our cognition and decision-making abilities.

24 Logical fallacies list ↗

Recognizing and avoiding logical fallacies is essential for critical thinking and effective communication.


Two psychologists ended up unlocking important keys to both the mind and to economics. Amos Tversky and Daniel Kahneman created the field of behavioral economics and revolutionized cognitive psychology with the discovery of a set of cognitive and psychological biases that affect our decision-making abilities.

These systematic errors in our thinking and logic affect our everyday choices, behaviors, and evaluations of others. For more on this topic, please also see the Cognitive Distortions and Logical Fallacies data sets.

Heuristics: Mental shortcuts

Psychological biases are often the result of heuristics, which are mental shortcuts that help people make decisions quickly, but sometimes at the expense of accuracy.

One of the most well-known biases is confirmation bias, which is the tendency to search for, interpret, and remember information in a way that confirms one’s pre-existing beliefs or hypotheses. This can lead individuals to ignore or dismiss evidence that challenges their views.

Another common bias is the anchoring effect, where individuals rely too heavily on an initial piece of information, known as the “anchor,” when making decisions. For example, if you are told that a shirt is on sale for $50, down from $100, you might perceive it as a good deal, even if the shirt is not worth $50.

The availability heuristic is a mental shortcut that leads people to overestimate the likelihood of events that are easily recalled. For instance, if someone recently heard about a plane crash, they might overestimate the dangers of flying, even though statistically, it is much safer than driving.
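The flying-versus-driving gap is large enough to survive any reasonable choice of inputs. A back-of-the-envelope sketch, using assumed, order-of-magnitude fatality rates for illustration (roughly in line with published U.S. transportation safety figures, but not authoritative values):

```python
# Rough per-mile fatality-risk comparison: driving vs. commercial flying.
# Both rates below are ASSUMED order-of-magnitude figures for illustration,
# not authoritative statistics.
DRIVING_DEATHS_PER_100M_MILES = 1.3    # passenger vehicles (assumed)
FLYING_DEATHS_PER_100M_MILES = 0.005   # commercial aviation (assumed)

TRIP_MILES = 500  # a typical short-haul trip

# Scale each per-100-million-mile rate down to a single trip.
risk_driving = DRIVING_DEATHS_PER_100M_MILES * TRIP_MILES / 100_000_000
risk_flying = FLYING_DEATHS_PER_100M_MILES * TRIP_MILES / 100_000_000

print(f"Driving {TRIP_MILES} miles: fatality risk ~ {risk_driving:.1e}")
print(f"Flying  {TRIP_MILES} miles: fatality risk ~ {risk_flying:.1e}")
print(f"Driving is roughly {risk_driving / risk_flying:.0f}x riskier per mile")
```

Even if the assumed rates are off by a factor of two in either direction, driving remains far riskier per mile, which is exactly the comparison the availability heuristic obscures when a vivid crash dominates recall.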

The Dunning-Kruger effect is a cognitive bias where individuals with low ability at a task overestimate their ability. Essentially, they are not skilled enough to recognize their own incompetence. On the flip side, highly competent individuals may underestimate their relative competence.

The halo effect is a type of bias where the perception of one positive trait of a person or thing influences the perception of other traits. For example, if someone is physically attractive, they are often perceived as more intelligent, talented, or kind.

Loss aversion is the tendency to prefer avoiding losses over acquiring equivalent gains. People are generally more upset about losing $20 than they are happy about gaining $20. This bias can lead to risk-averse behavior.
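Kahneman and Tversky formalized this asymmetry in prospect theory's value function. A minimal sketch using their commonly cited 1992 parameter estimates (alpha ~ 0.88 for diminishing sensitivity, lambda ~ 2.25 for loss aversion); the specific dollar amounts are just illustrations:

```python
# Prospect theory value function (after Tversky & Kahneman, 1992).
# Outcomes are valued relative to a reference point, with losses
# weighted more heavily than equivalent gains (loss aversion).
ALPHA = 0.88   # diminishing sensitivity to both gains and losses
LAMBDA = 2.25  # losses loom ~2.25x larger than equivalent gains

def subjective_value(outcome: float) -> float:
    """Map an objective gain or loss to its felt (subjective) value."""
    if outcome >= 0:
        return outcome ** ALPHA
    return -LAMBDA * ((-outcome) ** ALPHA)

gain = subjective_value(20)    # winning $20
loss = subjective_value(-20)   # losing $20
print(f"+$20 feels like {gain:+.1f}, -$20 feels like {loss:+.1f}")
# For equal magnitudes, |loss| / gain reduces exactly to LAMBDA.
```

The asymmetry is what drives the risk-averse behavior described above: a gamble with even odds of winning or losing $20 has a negative subjective value despite its zero expected dollar value.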

The bandwagon effect refers to the tendency of people to align their beliefs and behaviors with those of a group. This can be seen in various social phenomena such as fashion trends and political movements.

The hindsight bias is the inclination to see events as being more predictable after they have happened. People often believe that they “knew it all along,” which can create overconfidence in their ability to predict events.

These are just a handful of the full list of 30 psychological biases detailed below in the dictionary table. Arm yourself with awareness of these biases, as striving to think critically can help in making more rational and informed decisions.

Psychological biases dictionary

| Psychological bias | Explanation | Example |
| --- | --- | --- |
| action bias | belief that when we're faced with an ambiguous situation or challenge, we must take some action rather than do nothing, whether doing something is a good idea or not (and often quickly, without taking the time to fully examine the problem); also known as "naive interventionism" | sports enthusiasts rooting for their favorite teams are notorious for the superstitious rituals they are in psychological anguish if not able to perform, despite the objective fact that they have no ability whatsoever to affect the outcome (in pop culture, Robert De Niro's character in Silver Linings Playbook exemplifies this) |
| adjustment heuristic | tendency to start from an implicitly suggested reference point when assessing probabilities (the "anchor") and make adjustments to that reference point to reach an estimate | |
| affect heuristic | we tend to underestimate the role of feelings of liking and disliking in our judgments and decision-making | instead of considering risks and benefits independently, individuals with a negative attitude toward nuclear power may consider its benefits as low and risks as high, thereby leading to a more negative risk-benefit correlation than would be evident under conditions without time pressure (Finucane, Alhakami, Slovic, & Johnson, 2000) |
| anchoring effect | fixating on a value or number that gets compared to everything else, because we tend to compare and contrast limited sets of items (aka the "relativity trap"); store sale items take advantage of this (we compare the new price to the old, but not the old price on its own as a measure of worth) | |
| availability heuristic | tendency to make quick "intuitive" judgments about the size of given categories by the ease with which particular instances or examples of the class come to mind | |
| bandwagon effect | similar to groupthink; arising from our built-in desire to fit in and conform, we tend to "go along with the trend" when it becomes apparent to us | |
| contagion heuristic | tendency to avoid contact with people or objects viewed as "contaminated" by previous contact with someone or something viewed as "bad" | related to (and inclusive of) magical thinking, e.g. believing a person's sweater still carries their "essence" |
| confirmation bias | we tend to agree with those who agree with us and avoid associating with those who don't, to avoid the discomfort of cognitive dissonance (the Internet has sadly made this worse) | |
| conjunction fallacy | a formal fallacy that occurs when one believes a specific condition is more probable than a general one | |
| current moment bias | preference to experience pleasure now and put off the "pain" til later; lack of ability to imagine ourselves in the future and alter today's behaviors accordingly | |
| disjunction fallacy | misjudging the disjunction of two events as less likely than one of the events individually, when definitionally (via probability theory) it must be at least as likely | |
| false consensus effect | people tend to overestimate the degree to which the general public shares their beliefs and opinions | potentially related to the availability heuristic, the self-serving bias, and naive realism |
| focusing illusion | placing too much emphasis on one aspect of an event, outweighing its importance and causing error in judgment | |
| gambler's fallacy | putting a tremendous amount of weight on previous events, believing they will influence future outcomes (even when the outcome is random) | also frequently a logical fallacy |
| identifiable victim effect | tendency for people to care deeply about a single, specific tragedy but seem disinterested in vast atrocities affecting thousands or millions of people | more broadly, abstract concepts motivate us less than individual cases (especially when given visual evidence) |
| ingroup bias | overestimating the abilities and values of our immediate group and underestimating those of outgroups (oxytocin plays a role) | |
| naive realism | the belief that each one of us sees the world objectively, while the people who disagree with us must be either uninformed or irrational | "Everyone is influenced by ideology and self-interest. Except for me. I see things as they are." |
| negativity bias | we pay more attention to bad news | |
| neglecting probability | the reason we're afraid to fly even though it's statistically far more likely to be in a car accident (the same way we fear terrorism but not more mundane accidents that are far more likely) | |
| observational selection bias | suddenly noticing things we didn't notice before and assuming their frequency has increased (also contributes to the feeling that the appearance of certain things or events can't be coincidence) | |
| optimism bias | tendency to believe that good things happen more often than bad things | |
| planning fallacy | systematic tendency toward unrealistic optimism about the time it takes to complete tasks | |
| positive expectation bias | sense that our luck has to change for the better | |
| post-purchase rationalization | making ourselves feel better after we make crappy decisions (aka Buyer's Stockholm Syndrome) | |
| projection bias | assumption that most people think just like us (the false consensus effect is related: thinking that others agree with us) | |
| resemblance bias | tendency to ignore statistical facts and use resemblance as a simplifying heuristic to make difficult judgments | |
| self-serving bias | tendency to evaluate ambiguous or complex information in a way that is beneficial to one's own interests, as well as to claim responsibility for successes and attribute failures to others or to uncontrollable external factors | |
| shifting baseline syndrome | we tend to use very recent data points in our research (even when more data is available) and thus can miss picking up on long-term trends | |
| status-quo bias | we fear change, so we tend to make choices that guarantee things remain the same (and by extension, assume that any other choice will be inferior or make things worse) | |
| treadmill effect | our desire for the new version of a product or service is acute, even if upgrades are minor and incremental; but the pleasure we get from the new object wears off quickly, leaving us back at the original satisfaction baseline | |
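Two of the entries above, the conjunction and disjunction fallacies, violate identities that probability theory guarantees: P(A and B) can never exceed the probability of either event alone, and P(A or B) can never fall below it. A short sketch using assumed, purely illustrative probabilities for the classic "Linda problem":

```python
# The conjunction and disjunction fallacies violate two basic identities:
#   P(A and B) <= min(P(A), P(B))
#   P(A or B)  >= max(P(A), P(B))
# The probabilities below are ASSUMED values for illustration only.
p_bank_teller = 0.05           # P(Linda is a bank teller)
p_feminist = 0.60              # P(Linda is active in the feminist movement)
p_both = p_bank_teller * 0.9   # even if 90% of the teller cases overlap

# The conjunction can never beat either conjunct...
assert p_both <= min(p_bank_teller, p_feminist)

# ...and the disjunction (inclusion-exclusion) can never trail either disjunct.
p_either = p_bank_teller + p_feminist - p_both
assert p_either >= max(p_bank_teller, p_feminist)

print(f"P(teller and feminist) = {p_both:.3f} <= P(teller)   = {p_bank_teller:.2f}")
print(f"P(teller or feminist)  = {p_either:.3f} >= P(feminist) = {p_feminist:.2f}")
```

No choice of inputs can flip these inequalities; that is what makes the intuitive "she's probably a feminist bank teller" judgment a fallacy rather than a mere estimation error.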

Read More:

Top Mental Models for Thinkers ↗

Model thinking is an excellent way of improving our cognition and decision-making abilities.

28 Cognitive distortions list ↗

Cognitive distortions are bad mental habits and unhelpful ways of thinking that can limit one’s ability to function in the world.

24 Logical fallacies list ↗

Recognizing and avoiding logical fallacies is essential for critical thinking and effective communication.


A strong and prevalent cognitive bias causes a large majority of people to rate themselves as more skilled than is statistically possible. This lack of self-awareness, which leads us to overestimate our knowledge or ability in a given area, is known as the Dunning-Kruger Effect.

Positing the effect in 1999, Cornell psychologists David Dunning and Justin Kruger found that low-skilled people face a double bind: they think of themselves as very skilled, but they lack even the basic level of skill that would allow them to detect and learn from their mistakes in order to improve. It is very difficult for them to escape the “trap” of perceiving themselves as superior, which obviates any felt need to keep working at improvement.

They also found that individuals of high skill levels also suffer from a sort of “lensing effect” (now dubbed the Dunning-Kruger Effect accordingly) in terms of their own self-assessment, but in the other direction — they are not generally aware of the rarity of their gifts. They assume most other people have the same kinds of knowledge and critical thinking skills that they do. In other words, careful study of our images of ourselves found us all to be living in a bubble of inaccurate self-perception, on both ends.

How to counteract the Dunning-Kruger Effect:

  • Ask for feedback from other people, and listen to it honestly.
  • Keep learning, gathering knowledge, and improving your skills.

There are many things in life you don’t want to rush through; many experiences you wish to linger over. The American cult of efficiency is a kind of over-optimization, an over-fitting of a line that delusionally demands up-and-to-the-right every single day, every single quarter, every single time.

The benefits of stopping to smell the flowers have been extolled by sages and philosophers throughout the ages. In all of recorded human history lies some form of the mantra, “haste unto death” — for it is true. We rush headlong off the cliff after all the lemmings ahead of us. We can’t help ourselves — eternal moths to eternal flames.

The slow life

From cuisine to jurisprudence, from behavioral economics to psychological well-being, moving more slowly has numerous well-established benefits. Efficiency should never be the only goal, in any domain or at all times. As James Madison agreed, “moderation in all things” is the mathematically optimal way to approach life, justice, and governing. Influenced by the Marquis de Condorcet, the invention of statistics, and a distaste for extremism in all forms, the Founders were prescient regarding the later theory of the wisdom of crowds. They sought to temper the passions of the crowd via checks and balances in our system of governance.
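The “wisdom of crowds” intuition the Founders anticipated has a formal core in Condorcet's jury theorem: if each voter independently picks the better option with probability above one half, the chance that a majority gets it right climbs toward certainty as the group grows. A minimal sketch (the 55% individual accuracy figure is an arbitrary illustration):

```python
# Condorcet's jury theorem: majority accuracy grows with group size
# when each voter is independently correct with probability p > 0.5.
from math import comb

def majority_correct(n: int, p: float) -> float:
    """P(a strict majority of n independent voters is correct); n odd."""
    k_needed = n // 2 + 1  # smallest majority
    # Sum the binomial probabilities of k_needed..n correct voters.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_needed, n + 1))

for n in (1, 11, 101, 1001):
    print(f"{n:>5} voters at 55% accuracy: majority right "
          f"{majority_correct(n, 0.55):.1%} of the time")
```

The same arithmetic cuts the other way: if propaganda pushes individual accuracy below one half, a larger crowd becomes more reliably wrong, which is one reason to pair popular input with checks and balances.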

“The arc of the moral universe is long, but it bends toward justice,” said Martin Luther King, Jr. That the veracity of the quote remains unsettled is unsettling, like strange fruit swinging in the southern breeze. Yet the barbaric “quick justice” efficiency of slavery, the Confederacy, Jim Crow, superpredator panics, and the mowing down of unarmed Black men over traffic violations, to name a few, are no examples of fairness. Faster isn’t always better, especially when it comes to justice. It takes time to gather facts, talk to witnesses, piece together the crimes, and document them in an airtight way, brooking no doubt in the mind of a single juror.

More efficiency topics

Areas I’ll be further exploring:

  • Slow thinking — Daniel Kahneman’s behavioral economics and cognition theory about slow and fast thinking systems in the brain, how they physiologically arose, and their implications for bias, decision making, geopolitics, and more.
  • Journey vs. Destination — It’s not just about getting to the same restaurant and eating the same thing. The end doesn’t always justify the means. Traveler vs. Tourist. Go with the flow. Roll with it, baby.
  • An ounce of caution — A stitch of time. He who makes haste makes waste. Don’t count your chickens before they hatch. Be careful!
  • Self-reflection — Thoughtfulness. Rumination. Mindfulness. Presence.
  • Being too busy speeds up time, not necessarily in a good way. Leads to the unexamined life, a Stoic no-no. Socrates would not approve, dude.
  • Enoughness — Sustainability. Patience. Non-violence. Whole-heartedness.
  • Hierarchy vs. Fairness — Consensus takes a lot longer. Dictators and monarchs are nothing if not efficient.
  • The appeal of fascism — History and ideology of the Nazis and their obsession with efficiency.
  • PR — soundbites. Simple narratives. Tropes, slogans, repetition.
  • Entertainment — intellectual empty calories. Neil Postman. McLuhan.
  • Automation — AI, bots, robotics, threats to labor
  • Walking vs. Transportation
  • The slow food movement
  • Speed reading
  • Speed runs — video games

For every thoughtful, measured perspective on the gigantically thorny problem of Diversity in the Valley, there have to be at least 10 angry white dudes who feel entitled to take a shit all over the idea that being more inclusive has to involve, like, actually learning to be inclusive — or really, making any changes at all.

There are “values” far more pressing than equality, they say — EFFICIENCY! ALPHA ELITISM! SHAVING OFF ANOTHER 5 MINUTES OF SOME FULL STACK ENGINEER’S TIME (by outsourcing it to someone poor who should feel lucky to have the opportunity to schlep around the dirty laundry and fetch the burritos of Today’s World-Saving Heroes — preferably someone brown) so that someone, somewhere else (outside of the Valley, one presumes) can do all the theoretical Morally Good activities that serve as the philosophical prop that is supposed to justify the tech industry’s frantic, breakneck pursuit of getting filthy fucking rich — er, the mission-critically important “time-saving efficiency” — that has literally the rest of the world economy scrambling to catch up in its wake.

Ergo, in response to an interview with Slack engineer Erica Baker — whose 20% work-time role in contributing to company diversity strategy later in the thread apparently renders completely invisible her 80% role Writing Code with the Big Boys — this fellow feels he has an obligation to weigh in:

Yes, Kevin. TELL ME MORE about how I would be treated in an interview with you as hiring manager. One thing’s for sure, I could be completely confident that you lack a shred of skepticism about whether my qualifications make me “The Best” candidate in the self-fulfilling prophecy of your own perception.

Nevermind all the actual data that is finally beginning to show what the reality of nature already knows: DIVERSITY WINS. Being inclusive of a multiplicity of experience and perspective (which come along as a byproduct of the heuristic we can make use of — demographical appearance — as a rough approximate solution to our complete inability to objectively measure anything meaningful about the internal complexities of real people) makes companies stronger and more resilient.

Diversity makes companies more antifragile by embracing the comparative disorder that is counterintuitive to the homogenous systems and societies we keep inanely trying to collectively build despite all the evidence of their abject failure throughout history. Our friend in Idaho is proof of this point, exemplifying the dominant assumption that diversity definitionally reduces efficiency, thereby reducing profit.

Beyond being flat out wrong when you look at the data (which, curiously, diversity always seems to be a special case where otherwise ruthlessly data-driven engineers don’t dare to tread), this carries with it the hidden assumption which is the self-fulfilling prophecy that actually proves Erica’s point: the fundamental skepticism that people who aren’t white and male can possibly be The Best. That the only way they ever get a seat at the communal, lunch-ordered-by-bot-and-hand-delivered-by-poor-non-alpha-elite-coder-people table is by the magnanimous grace of some Do Gooder hiring manager or recruiter slavishly following regulatory orders from the government — and not by their own merit.

The plank in our own eyes

Part of this has to do with the historically definitional white male privilege that, for some reason, we’re still arguing about in our supposedly enlightened and modernized society, whose blinders prevent the deep self-examination of our human past required to truly make progress. As if the human tendency to Other were somehow wiped away with the Emancipation Proclamation (1863), the Fourteenth Amendment (1868), Brown v. Board of Education (1954), the Civil Rights Act (1964), the Voting Rights Act (1965), Loving v. Virginia (1967), the Fair Housing Act (1968), the Community Reinvestment Act (1977), or the end of the carceral state (TKTK).

Having grown up a person saddled with two X chromosomes my whole life with almost no choice but to wrestle with this reality from every single angle intellectual and emotional, I at least finally understand the fundamental psychological biases that lead to this kind of abject refusal to deal with our own skewed perspectives — opting instead for ratcheting up ever more impressive shouting matches to peacock about how our dizzying intellectual prowess is surely proof enough of our obvious objectivity.

We are all wrong. And I’m no different.

I know that we desperately want to believe in our own superiority, both to everything that came before us throughout history (the “illusion of progress” we cultivate — despite no such guarantee existing in the natural world — only adds to this effect) and to our fellow humans. Elitism is the ultimate -ism.

It subsumes racism, sexism, religious fundamentalism, and all forms of tribalism that each have, at their roots, the core premise that whatever group I’ve chosen to join up with (or been allotted to by random lottery) is clearly and objectively The Best Group. It’s the undeniable tautology of naive realism that leaves us trapped in the pathetically, perennially distorted view that “I know best, and by the transitive property of awesome, all the groups I consider myself a part of are therefore clearly also The Best (else, why would I be part of them?!).” This automagically relegates all the groups with which we don’t identify to the bottom of the heap: obviously inferior, as anyone can see!

Combine this native human bias with the delirious modern cocktail of vicious neoliberalism and aggressive techno-utopian libertarianism, and it’s a formula in which People Who Don’t Appear White and Male are definitionally suspect because of the statistics we’re blanketed with every day that tell us they are under-represented in fields like technology.

“If this is so,” says the mind of a brilliant and inarguably logical engineer, “it can only be because their Rugged Individualism hasn’t endowed them with the skills to pass muster. It’s a shame, really — at least Other People, somewhere else who care about human beings more than machine learning are concerned with this dilemma (so I don’t have to be: after all, I’m really fucking busy saving the world so STOP BOTHERING ME with this irrelevant claptrap distraction already! AND WHERE IS MY GODDAMN BURRITO?!?! It’s my Soylent off day!!!) — but honestly I have no choice but to treat The Next Brown or Curvy Data Point I See with some measure of statistical skepticism.”

Lack of diversity is a self-fulfilling prophecy

Therein lies the rub. When we take an observation about the “way things are” and leap to the moral conclusion that this is rightly so — that things ought to be this way, because clearly they are this way for some reason — we commit the logical fallacy that so consumed Hume: the idea that we can derive what ought to be from what is, also known as the fact/value problem.

I don’t think most white male engineers would go quite so far as to claim that their industry must remain homogeneous to succeed (although clearly some do, like our friend Kevin, who apparently believes that diversity is definitionally both inefficient and a straight ticket to the business failure shitter — and that our only moral interest in the problem is spurred by the meddlesome interference of that old bugaboo The Government). Instead, in Silicon Valley it tends to take the form of justifying inaction: they might provisionally admit (over an artisanally-prepared, locally-sourced (from a Tenderloin window box herb garden) cocktail at Bar Crudo, or perhaps a Blue Bottle americano) that the problem of diversity may warrant some moral scrutiny, but not by them. They are just way too busy swimming for the shores of a Better World (so long as a Better World enriches them and their investors, natch) to be bothered with an issue they perceive as not having the slightest effect on them. In times like these (which seems to be All Times), we simply can’t afford the moral luxury of anything but lifeboat ethics.

Right? Well, wrong — unless we’re not troubled by the absurd logical paradox of making ourselves subject to both the zero-sum philosophy this requires and the free market ideology of infinitely available value creation that is supposed to be driving the entire economic party bus (with karaoke) we’re riding in. So, we have to decide: which is it? Is there economic opportunity for all, or do the pathetic losers who fail to become startup founders get left at the curb? And if so, who will sing the songs of their people?!

Our own worst enemies

A reference to the old saw that “attitude is everything” is appropriate here. Because one of the few things more exasperating than the unexamined privilege of ignoring the issue is the endless infighting that those of us in marginalized groups do with each other over what the solution should be.

…where to even start? Let me explain… no, there is too much. Let me sum up: this comment from some random white dude who loves extreme sports begins and ends with the outrageously outsized entitlement of trying to tell Slack how to run its own goddamn business, from atop his lofty perch of Somewhere That Is Not Anywhere Even Remotely Near being an actual employee of Slack with some potentially arguable skin in the game, much less a leader or decision-maker within the company.

I mean, Jesus. This is what we’re dealing with. A worldview so vehemently opposed to the idea of apparently even discussing the matter of diversity (in case some terminology or phrase or godforsakenly challenging idea might be construed as controversial and somewhere, someone might possibly be offended — like the entire LGBT community he tries to lump me in with, and in a follow-up comment — without a shred of irony! — attempts to claim he was only “speaking for himself” while both demanding a public apology and insinuating that Erica Baker, the Slack engineer, should literally lose her job for daring to state an opinion while black (p.s. we’ve truly come full fucking circle now, haven’t we?!)) that people feel compelled to spend their time offering free, unwarranted, and undoubtedly unwanted “business advice” to the company THAT PRESUMABLY KNOWS BETTER ABOUT WHAT IT IS DOING than Richard Fucking Burton The Third of His Name!

How can you even hold such a logical paradox in your head, much less lay it out in a single paragraph: the idea that somehow, bizarrely, Slack itself not only lacks control over whether or not Erica Baker may be “let go for similar remarks” (I mean, who would be doing the firing in this case?! Is there some vigilante regulatory-required Anti-Social-Justice-Warrior in tights and a cape flying around Silicon Valley waiting for bat signals sent from comments on TechCrunch to swoop in from outside the company and authorize her termination?!), but may also be on such shaky ground by some available success metric (I assure you it’s not. It’s one of the few blindingly amazing success stories of recent memory and continues to be one of the fastest growing enterprise startups Of All Time) that they might just have to resort to taking the advice of some Totally Irrelevant Troll about what their fucking brand should be?!?

I. JUST. CAN’T. EVEN!!! (can you?! if so, better abandon all ye hope of ever working at Slack.)

Just goes to show: we’ll cling to whatever flimsy life raft of privilege we think we’re on, even as the Leaky Lifeboat (not to mention the Queen Friggin’ Mary) sails past, breathing a sigh of relief that we don’t seem eager to hop on and capsize it.

Everyone calm down. But be prepared to leave through the eastern gate

Let’s all dial down our Adderall drips for just one minute (but that’s all we can afford — the lifeboat awaits and all) and take a chill pill (feel free to take this as literally as you like). Do some soul-searching reflection, consult our Headspace apps, meditate in VR, or whatever the frak we need to do to enter the Tao Space.

Now let’s ask ourselves: if we believe we’re striving ever more harriedly toward a Better World, then what the heck does that world even look like? Close your eyes and picture it: what do you see? Are people happy in this world? Do they seem to go about their lives effortlessly and with graceful purpose in the human-connected face of god (for lack of a better term… so far), or are they still scurrying to and fro in the franticness of Trying To Get There?

Do people treat each other well, and with respect despite their differences, and in the face of overwhelming obstacles and risks we will have an impossible time solving from within isolated bunkers — or are they still spewing vitriol at each other over their gleefully intentional mischaracterizations of each other’s intentions?

Do they exhibit peace in the struggle, or are they still trying to shout each other down inside of every comment thread and social media exchange on the internet just to win a tiny provincial shadow of an urgently important argument about who has The Best Idea on how we can live in peace and harmony with each other, and how to impose it on the rest of those poor, lazy suckers who simply aren’t as gifted as the elite leaders who so grudgingly bear the wearisome heavy burden of Saving The World whilst being rewarded ever-so-handsomely with Real Non-Inflation Eaten Wages, lucrative stock options and liquidation preferences, artisanal cocktails, and Magically Appearing Burritos?

If we don’t even know what it looks like, then how will we know what values we should be working for, or recognize if and when we’ve arrived?
