The concept of “prebunking” emerges as a proactive strategy in the fight against disinformation, an ever-present challenge in the digital era where information spreads at unprecedented speed and scale. In essence, prebunking involves the preemptive education of the public about the techniques and potential contents of disinformation campaigns before they encounter them. This method seeks not only to forewarn but also to forearm individuals, making them more resilient to the effects of misleading information.

Understanding disinformation

Disinformation, by definition, is false information that is deliberately spread with the intent to deceive or mislead. It’s a subset of misinformation, which encompasses all false information regardless of intent.

In our current “information age,” the rapid dissemination of information through social media, news outlets, and other digital platforms has amplified the reach and impact of disinformation campaigns. These campaigns can have various motives, including political manipulation, financial gain, or social disruption, and at times all of the above, particularly in the case of information warfare.

The mechanism of prebunking

Prebunking works on the principle of “inoculation theory,” a concept borrowed from immunology and developed in social psychology by William J. McGuire in the 1960s. Much like a vaccine introduces a weakened form of a virus to stimulate the immune system’s response to it, prebunking introduces individuals to a weakened form of an argument or disinformation tactic, thereby enabling them to recognize and resist such tactics in the future.

The process typically involves several key elements:

  • Exposure to Techniques: Educating people on the common techniques used in disinformation campaigns, such as emotional manipulation, conspiracy theories, fake experts, and misleading statistics.
  • Content Examples: Providing specific examples of disinformation can help individuals recognize similar patterns in future encounters.
  • Critical Thinking: Encouraging critical thinking and healthy skepticism, particularly regarding information sources and their motives. Helping people identify trustworthy media sources and discern credible sources in general.
  • Engagement: Interactive and engaging educational methods, such as games or interactive modules, have been found to be particularly effective in prebunking efforts.

The effectiveness of prebunking

Research into the effectiveness of prebunking is promising. Studies have shown that when individuals are forewarned about specific misleading strategies or the general prevalence of disinformation, they are better able to identify false information and less likely to be influenced by it. Prebunking can also increase resilience against disinformation across various subjects, from health misinformation such as the anti-vaccine movement to political propaganda.

However, the effectiveness of prebunking can vary based on several factors:

  • Timing: For prebunking to be most effective, it needs to occur before exposure to disinformation. Once false beliefs have taken root, they are much harder to correct — due to the backfire effect and other psychological, cognitive, and social factors.
  • Relevance: The prebunking content must be relevant to the audience’s experiences and the types of disinformation they are likely to encounter.
  • Repetition: Like many educational interventions, the effects of prebunking can diminish over time, suggesting that periodic refreshers may be necessary.

Challenges and considerations

While promising, prebunking is not a panacea for the disinformation dilemma. It faces several challenges:

  • Scalability: Effectively deploying prebunking campaigns at scale, particularly in a rapidly changing information environment, is difficult.
  • Targeting: Identifying and reaching the most vulnerable or targeted groups before they encounter disinformation requires sophisticated understanding and resources.
  • Adaptation by Disinformers: As prebunking strategies become more widespread, those who spread disinformation may adapt their tactics to circumvent these defenses.

Moreover, there is the ethical consideration of how to prebunk without inadvertently suppressing legitimate debate or dissent, ensuring that the fight against disinformation does not become a vector for censorship.

The role of technology and media

Given the digital nature of contemporary disinformation campaigns, technology companies and media organizations play a crucial role in prebunking efforts. Algorithms that prioritize transparency, the promotion of factual content, and the demotion of known disinformation sources can aid in prebunking. Media literacy campaigns, undertaken by educational institutions and NGOs, can also equip the public with the tools they need to navigate the information landscape critically.

Prebunking represents a proactive and promising approach to mitigating the effects of disinformation. By educating the public about the tactics used in disinformation campaigns and fostering critical engagement with media, it’s possible to build a more informed and resilient society.

However, the dynamic and complex nature of digital disinformation means that prebunking must be part of a broader strategy that includes technology solutions, regulatory measures, and ongoing research. As we navigate this challenge, the goal remains clear: to cultivate an information ecosystem where truth prevails, and public discourse thrives on accuracy and integrity.


Conspiracy Theory Dictionary: From QAnon to Gnostics

In half a decade we’ve gone from Jeb Bush making a serious run for president to Marjorie Taylor Greene running unopposed and winning a House seat in Georgia. QAnon came seemingly out of nowhere, but taps into a much deeper and older series of conspiracy theories that have surfaced, resurfaced, and been remixed throughout time.

Essentially, QAnon is a recycling of the Protocols of the Elders of Zion conspiracy theory that drove Nazi ideology and helped fuel the genocide of six million Jews, along with Roma, gay people, and the regime’s many other victims. It’s a derivative of the global cabal conspiracy theory, and is riddled with the kind of conspiratorial paranoia that led to the deaths of over 75 million people in World War II.

The spread of the QAnon conspiracy theory greatly benefits from historical memory, getting a generous marketing boost from sheer familiarity. It also benefits from an authoritarian mentality growing louder in America, with a predilection for magical thinking and a susceptibility to conspiratorial thinking.


Tales as old as time

Conspiracy theories have been around far longer than even the Protocols, stretching back about as far as recorded history itself. Why do people believe in them? In an increasingly complex world brimming with real-time communication capabilities, the cognitive appeal of easy answers may simply be stronger than ever before.

Anthropologists believe that conspiracy theories have been around for about as long as human beings have been able to communicate. Historians trace one of the earliest known conspiracy theories to ancient Mesopotamia, involving the god Marduk and the goddess Tiamat, both figures in Babylonian creation mythology.

According to the myth, Marduk defeated Tiamat in battle and created the world from her body — but some ancient Mesopotamians at the time thought that the story was not actually a mere myth, but a political cover-up of a real-life conspiracy in which the followers of Marduk secretly plotted to overthrow Tiamat to seize power.

This “original conspiracy theory” was likely driven by political tensions between city-states in ancient Mesopotamia, although there are very few written records still around to corroborate the origin of the theory or perception of the story at the time. Nevertheless, the Marduk-Tiamat myth is regarded as one of the earliest known examples of widespread belief in conspiracy theories, and it points to the relative commonality and frequency of false narratives throughout history.

Whether deployed purposefully to deceive a population for political advantage, created to exploit people economically, or invented “naturally” as a simple yet satisfying explanation for otherwise complicated and overwhelming phenomena, conspiracy theories are undoubtedly here to stay in culture more broadly for some time to come. We had best get the lay of the land, and understand the language we might use to describe and talk about them.


Conspiracy Theory Dictionary

4chan: A notorious internet message board with an unruly culture capable of trolling, pranks, and crimes.
8chan: If 4chan wasn’t raw and lawless enough for you, you could try the even more right-wing “free speech” haven 8chan while it still stood (now 8kun). Described by its founder Fredrick Brennan as “if 4chan and reddit had a baby,” the site is notorious for incubating Gamergate, which morphed into PizzaGate, which morphed into QAnon, and for generally being a cesspool of humanity’s worst impulses.
9/11 truthers: People who believe the attacks on the Twin Towers in New York City in 2001 were either known about ahead of time and allowed to happen, or were intentionally planned by the US government.
alien abduction: People who claim to have been captured by intelligent life from another planet, taken to a spaceship or other plane of existence, and brought back, as well as the folks who believe them.
American carnage: A phrase from Donald Trump’s 2017 inaugural address, evocative of the theme of “immense loss” in Nazi mythology.
Antifa: Antifa is anti-fascism, so the anti-anti-fascists are just fascists wrapped in a double negative. They are the real cancel culture, and a dangerous one (book burning and everything!).
Anti-Semitism: One of history’s oldest hatreds, stretching back to early biblical times.
Biblical inerrancy: The doctrine that the Bible, in its original manuscripts, is without error or fault in all its teachings.
birtherism: One of Donald Trump’s original Big Lies: the claim that President Barack Obama wasn’t born in the U.S. and therefore wasn’t a “legitimate” president.
Black Lives Matter: A social justice movement advocating for non-violent civil disobedience in protest against incidents of police brutality and all racially motivated violence against black people.
blood libel: A false accusation that Jewish people used the blood of Christians, especially children, in religious rituals; historically used to justify persecution of Jews.
child trafficking: The illegal practice of procuring or trading children for the purpose of exploitation, such as forced labor, sexual exploitation, or illegal adoption.
Christian Identity: A religious belief system asserting that white people of European descent are God’s chosen people; often associated with white supremacist and extremist groups.
climate change denial: The rejection or dismissal of the scientific consensus that the climate is changing and that human activity is a significant contributing factor. Part of a broader cultural trend of science denialism.
The Confederacy: The Confederate States of America, a group of 11 southern states that seceded from the United States in 1861, leading to the American Civil War, primarily over the issue of slavery.
contamination: The presence of an unwanted substance or impurity in another substance, making it unsafe or unsuitable for use.
cosmopolitanism: Another term for globalist or internationalist, which are all dog whistles for Jewish people (see also: global cabal, blood libel).
Crossing the Rubicon: A phrase signifying passing a point of no return, derived from Julius Caesar’s irreversible crossing of the Rubicon River in 49 BC, which led to the Roman Civil War.
cultural Marxism: An antisemitic conspiracy theory alleging that Jewish intellectuals who fled the Hitler regime infected American culture with communist takeover plans, and that this holy war is the one the right wing fights each day.
deep state: The idea of a body within the government and military that operates independently of elected officials, often believed to manipulate government policy and direction.
DVE (domestic violent extremism): Violent acts committed within a country’s borders by individuals motivated by domestic political, religious, racial, or social ideologies.
fake news: Information that is false or misleading, created and disseminated with the intent to deceive the public or sway public opinion.
GamerGate: A controversy that started in 2014 involving the harassment of women in the video game industry, under the guise of advocating for ethics in gaming journalism.
George Soros: A Hungarian-American billionaire investor and philanthropist, often the subject of unfounded conspiracy theories alleging he manipulates global politics and economies.
Hollywood: The historic center of the United States film industry, often used to refer broadly to American cinema and its cultural influence.
Illuminati: A term associated with various conspiracy theories alleging a secret society controlling world affairs; originally the Bavarian Illuminati, an Enlightenment-era secret society.
InfoWars: A controversial far-right media platform known for promoting conspiracy theories, disinformation, and misinformation, hosted by Alex Jones.
JFK assassination: The assassination of President John F. Kennedy on November 22, 1963, in Dallas, Texas; an event surrounded by numerous conspiracy theories regarding the motives and identities of the assassins.
John Birch Society: The QAnon of its day (circa the 1960s), this extreme right-wing group was nominally devoted to anti-communism but espoused a host of conspiracy theories and outlandish beliefs.
lamestream media: A derogatory term for any media that isn’t right-wing media.
leftist apocalypse: A hyperbolic term used by some critics to describe a scenario in which leftist or progressive policies lead to societal collapse or significant negative consequences.
Makers and Takers: A right-wing economic dichotomy dividing those who contribute to society or the economy (makers) from those perceived to take from it without contributing (takers). See also: Mudsill theory, trickle-down economics, supply-side economics, Reaganomics, Libertarianism.
micro-propaganda machine (MPM): The targeted, small-scale dissemination of propaganda, often through social media and other digital platforms, to influence public opinion or behavior.
motivated reasoning: The cognitive process by which individuals form conclusions more favorable to their preexisting beliefs or desires than the objective evidence warrants.
New World Order: A conspiracy theory positing a secretly emerging totalitarian world government, often associated with fears of lost sovereignty and individual freedoms (see also: OWG, ZOG).
nullification: A constitutional “theory” put forth by southern states before the Civil War that they have the power to invalidate any federal laws or judicial decisions they consider unconstitutional. It has never been upheld by the federal courts.
One World Government: The concept of a single government authority governing the entire world, discussed in the context of global cooperation or, conversely, as a dystopian threat in conspiracy theories (see also: NWO, ZOG).
PizzaGate: A debunked and baseless conspiracy theory alleging the involvement of certain U.S. political figures in a child sex trafficking ring supposedly operated out of a Washington, D.C., pizzeria.
post-truth: A cultural and political context in which debate is framed largely by appeals to emotion disconnected from the details of policy, and by the repeated assertion of talking points whose factual rebuttals are ignored.
PR: Public relations.
propaganda: Information, especially of a biased or misleading nature, used to promote a political cause or point of view.
Protocols of the Elders of Zion: A forged antisemitic document alleging a secret Jewish plot for world domination, used by Hitler to gin up support for his regime.
PsyOps (psychological operations): Operations intended to convey selected information and indicators to audiences to influence their emotions, motives, objective reasoning, and ultimately the behavior of governments, organizations, groups, and individuals. Used as part of hybrid warfare and information warfare tactics in geopolitical (and, sadly, domestic) arenas.
QAnon: A baseless conspiracy theory alleging that a secret cabal of Satan-worshipping pedophiles is running a global child sex-trafficking ring and plotting against former U.S. President Donald Trump.
Q Drops: Messages or “drops” posted on internet forums by “Q,” the anonymous figure at the center of the QAnon conspiracy theory; often cryptic and claiming to reveal secret information about a supposed deep state conspiracy.
reactionary modernism: The combination of modern technological development with traditionalist or reactionary political and cultural beliefs, often seen in fascist ideologies.
Reichstag fire: An arson attack on the Reichstag building (home of the German parliament) in Berlin on February 27, 1933, which the Nazi regime used as a pretext to claim that Communists were plotting against the German government.
Rothschilds: A wealthy Jewish banking family, often subject to unfounded conspiracy theories alleging they control global financial systems and world events.
sock puppets: Online identities used for purposes of deception, such as to praise, defend, or support a person or organization while appearing to be an independent party.
“Stand back and stand by”: A phrase used by former U.S. President Donald Trump during a 2020 presidential debate, interpreted as a call to readiness by the Proud Boys, a far-right and neo-fascist organization that appeared to answer that call during the riot and coup attempt at the Capitol on January 6, 2021.
The Storm: Within QAnon, a prophesied event in which members of the supposed deep state cabal will be arrested and punished for their crimes.
WikiLeaks: A controversial platform known for publishing classified and secret documents from anonymous sources, gaining international attention for its major leaks. While it has played a significant role in exposing hidden information, its release of selectively edited materials has also contributed to the spread of conspiracy theories related to American and Russian politics.
ZOG (Zionist Occupation Government): An antisemitic and unfounded conspiracy theory claiming that Jewish people secretly control a country, particularly the United States.

The concept of a “confirmation loop” in psychology is a critical element to understand in the contexts of media literacy, disinformation, and political ideologies. It operates on the basic human tendency to seek out, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses, known as confirmation bias. This bias is a type of cognitive bias and a systematic error of inductive reasoning that affects the decisions and judgments that people make.

Understanding the confirmation loop

A confirmation loop occurs when confirmation bias is reinforced in a cyclical manner, often exacerbated by the selective exposure to information that aligns with one’s existing beliefs. In the digital age, this is particularly prevalent due to the echo chambers created by online social networks and personalized content algorithms.

These technologies tend to present us with information that aligns with our existing views, thus creating a loop where our beliefs are constantly confirmed, and alternative viewpoints are rarely encountered. This can solidify and deepen convictions, making individuals more susceptible to disinformation and conspiracy theories, and less tolerant of opposing viewpoints.

Media literacy and disinformation

Media literacy is the ability to identify different types of media and understand the messages they’re sending. It’s crucial in breaking the confirmation loop as it involves critically evaluating sources of information, their purposes, and their impacts on our thoughts and beliefs.

With the rise of digital media, individuals are bombarded with an overwhelming amount of information, making it challenging to distinguish between credible information and disinformation. It is paramount to curate your own set of credible sources and to vet the ethics and integrity of new sources you come across.

Disinformation, or false information deliberately spread to deceive people, thrives in an environment where confirmation loops are strong. Individuals trapped in confirmation loops are more likely to accept information that aligns with their preexisting beliefs without scrutinizing its credibility. This makes disinformation a powerful tool in manipulating public opinion, especially in politically charged environments.

Political ideologies

The impact of confirmation loops on political ideologies cannot be overstated. Political beliefs are deeply held and can significantly influence how information is perceived and processed.

When individuals only consume media that aligns with their political beliefs, they’re in a confirmation loop that can reinforce partisan views and deepen divides. This is particularly concerning in democratic societies where informed and diverse opinions are essential for healthy political discourse.

Operation of the confirmation loop

The operation of the confirmation loop can be seen in various everyday situations. For instance, a person might exclusively watch news channels that reflect their political leanings, follow like-minded individuals on social media, and participate in online forums that share their viewpoints.

Algorithms on many platforms like Facebook and Twitter (X) detect these preferences and continue to feed similar content, thus reinforcing the loop. Over time, this can result in a narrowed perspective, where alternative viewpoints are not just ignored but may also be actively discredited or mocked.
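The reinforcement dynamic described above can be illustrated with a toy simulation. This is a minimal sketch under loudly stated assumptions: the model, its parameters, and the function name are all illustrative inventions, not a description of any real platform’s recommendation algorithm. A user’s “lean” between two viewpoints drifts toward whatever the feed serves; a preference-mirroring feed pushes a neutral user toward a pole, while an unbiased feed does not.

```python
import random

def simulate_feedback_loop(steps=2000, lean=0.5, feed_bias=0.7,
                           adapt=0.02, seed=1):
    """Toy model of an engagement-driven feed (hypothetical, for illustration).

    `lean` is the user's preference for one of two viewpoints (0.0 to 1.0).
    With probability `feed_bias` the feed serves whichever viewpoint currently
    dominates the user's preference; otherwise it serves a random viewpoint.
    Each item consumed nudges the preference toward that item's viewpoint.
    Returns the final preference.
    """
    rng = random.Random(seed)
    for _ in range(steps):
        dominant = 1 if lean >= 0.5 else 0
        if rng.random() < feed_bias:
            item = dominant            # the feed mirrors the existing preference
        else:
            item = rng.randint(0, 1)   # occasional cross-cutting content
        lean += adapt * (item - lean)  # consumption reinforces the preference
    return lean

# A preference-mirroring feed polarizes a neutral user;
# an unbiased feed leaves them near the middle.
print("biased feed:  ", round(simulate_feedback_loop(), 2))
print("unbiased feed:", round(simulate_feedback_loop(feed_bias=0.0), 2))
```

The point of the sketch is structural: the loop closes because the feed’s input (current preference) is also shaped by the feed’s output, so even a modest mirroring bias compounds over time.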

Becoming more aware and breaking the loop

Becoming more aware of confirmation loops and working to break them is essential for fostering open-mindedness and reducing susceptibility to disinformation. Here are several strategies to achieve this:

  1. Diversify Information Sources: Actively seek out and engage with credible sources of information that offer differing viewpoints. This can help broaden your perspective and challenge your preconceived notions.
  2. Critical Thinking: Develop critical thinking skills to analyze and question the information you encounter. Look for evidence, check sources, and consider the purpose and potential biases behind the information.
  3. Media Literacy Education: Invest time in learning about media literacy. Understanding how media is created, its various forms, and its impact can help you navigate information more effectively.
  4. Reflect on Biases: Regularly reflect on your own biases and consider how they might be affecting your interpretation of information. Self-awareness is a crucial step in mitigating the impact of confirmation loops.
  5. Engage in Constructive Dialogue: Engage in respectful and constructive dialogues with individuals who hold different viewpoints. This can expose you to new perspectives and reduce the polarization exacerbated by confirmation loops.

The confirmation loop is a powerful psychological phenomenon that plays a significant role in shaping our beliefs and perceptions, especially in the context of media literacy, disinformation, and political ideologies. By understanding how it operates and actively working to mitigate its effects, individuals can become more informed, open-minded, and resilient against disinformation.

The path toward breaking the confirmation loop involves a conscious effort to engage with diverse information sources, practice critical thinking, and foster an environment of open and respectful discourse.


The concept of ego defenses, also known simply as defense mechanisms, is fundamental in the field of psychology, particularly within the psychoanalytic framework established by Sigmund Freud and further developed by his daughter Anna Freud and other psychoanalysts. These mechanisms are subconscious safeguards that protect individuals from anxiety and the awareness of internal or external dangers or stressors.

Understanding ego defense mechanisms

Ego defenses operate at a psychological level to help manage the conflicts between internal impulses and external reality. They often work by distorting, transforming, or somehow denying reality. While these mechanisms can vary widely in terms of their sophistication and the level of distortion they involve, all serve the primary function of reducing emotional distress.

Some common defense mechanisms include:

  • Denial: Refusing to accept reality because it is too painful or difficult to face.
  • Repression: Unconsciously blocking unacceptable thoughts or desires from consciousness.
  • Projection: Attributing one’s own unacceptable thoughts or feelings to others.
  • Rationalization: Creating a seemingly logical reason for behavior that might otherwise be shameful.
  • Displacement: Redirecting emotions from a ‘dangerous’ object to a ‘safe’ one.
  • Regression: Reverting to behavior characteristic of an earlier stage of development when confronted with stress.

These mechanisms aren’t inherently bad; they can be essential for coping with stress and can be adaptive in many circumstances. However, when overused or used inappropriately, they can lead to unhealthy patterns and psychological distress.

Ego defense mechanisms and disinformation

When it comes to disinformation, conspiracy theories, and extremist ideologies, ego defenses play a crucial role in how individuals process and react to information that conflicts with their existing beliefs or worldviews. This intersection is particularly apparent in the phenomena of denial, projection, and rationalization.

  1. Denial comes into play when individuals refuse to accept verified facts because these facts are uncomfortable or threatening to their pre-existing views or sense of self. For example, someone might deny the impacts of climate change because acknowledging it would necessitate uncomfortable changes in their lifestyle or worldview.
  2. Projection is evident when individuals attribute malicious intent or undesirable traits to others rather than recognizing them in themselves. In the realm of conspiracy theories, this can manifest as accusing various groups or organizations of conspiring for control, thereby projecting one’s own feelings of vulnerability or distrust.
  3. Rationalization allows individuals to justify belief in disinformation or extremist ideologies by providing reasonable but false explanations for these beliefs. This can often involve elaborate justifications for why certain pieces of disinformation fit into their broader understanding of the world, despite clear evidence to the contrary.

The psychological appeal of extremist ideologies

Extremist ideologies often provide a sense of certainty, control, and identity, all of which are deeply appealing on a psychological level, particularly for individuals feeling disconnected or powerless. These ideologies can effectively reduce psychological discomfort by providing simple, albeit inaccurate, explanations for complex social or personal issues.

How ego defenses facilitate belief in extremist ideologies

Ego defenses facilitate adherence to extremist ideologies by allowing individuals to:

  • Avoid cognitive dissonance: Maintaining a consistent belief system, even if it’s flawed, helps avoid the discomfort of conflicting beliefs.
  • Feel part of a group: Aligning with a group that shares one’s defensive strategies can reinforce a sense of belonging and identity.
  • Displace emotions: Directing negative emotions towards ‘out-groups’ or perceived enemies rather than dealing with personal issues or societal complexities.

Ego defenses keep false beliefs “sticky”

Ego defenses are not only fundamental to personal psychological functioning but also play a significant role in how people interact with and are influenced by broader societal narratives. Understanding the role of these mechanisms in the context of disinformation, conspiracy theories, and extremist ideologies is crucial for addressing these issues effectively. This understanding helps illuminate why such beliefs are appealing and resistant to change, highlighting the need for approaches that address underlying psychological needs and defenses.

Knowing the power of ego defenses helps explain why we shouldn’t expect people to part with their strongly-held false beliefs based on simple exposure to actual facts or corrective information — there is often something much deeper going on. In fact, confronting a conspiracy theorist or extremist with contradictory facts or information can often lead to a backfire effect, where the individual comes away more strongly committed to their false beliefs than they were before.


Stochastic terrorism is a term that has emerged in the lexicon of political and social analysis to describe a method of inciting violence indirectly through the use of mass communication. This concept is predicated on the principle that while not everyone in an audience will act on violent rhetoric, a small percentage might.

The term “stochastic” refers to a process that is randomly determined; it implies that the specific outcomes are unpredictable, yet the overall distribution of these outcomes follows a pattern that can be statistically analyzed. In the context of stochastic terrorism, it means that while it is uncertain who will act on incendiary messages and violent political rhetoric, it is almost certain that someone will.
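That statistical intuition can be made concrete with a toy calculation. Assuming (unrealistically, for illustration only) that each audience member acts independently with the same tiny probability, the chance that no one acts shrinks exponentially with audience size, so the chance that someone acts approaches certainty:

```python
def p_at_least_one(p_individual: float, audience: int) -> float:
    """Probability that at least one audience member acts, under the
    simplifying assumption of independent, identical per-person odds."""
    return 1 - (1 - p_individual) ** audience

# A one-in-a-million chance per listener becomes near-certainty at scale.
for audience in (10_000, 1_000_000, 100_000_000):
    print(f"audience {audience:>11,}: "
          f"P(someone acts) = {p_at_least_one(1e-6, audience):.4f}")
```

Real audiences are of course not independent coin flips, but the qualitative conclusion survives: no individual actor is predictable, while the aggregate outcome is.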

The nature of stochastic terrorism

Stochastic terrorism involves the dissemination of public statements, whether through speeches, social media, or traditional media, that incite violence. The individuals or entities spreading such rhetoric may not directly call for political violence. Instead, they create an atmosphere charged with tension and hostility, suggesting that action must be taken against a perceived threat or enemy. This indirect incitement provides plausible deniability, as those who broadcast the messages can claim they never explicitly advocated for violence.

Prominent stochastic terrorism examples

The following are a few notable examples of stochastic terrorism:

  1. The Oklahoma City Bombing (1995): Timothy McVeigh, influenced by extremist anti-government rhetoric, the 1992 Ruby Ridge standoff, and the 1993 siege at Waco, Texas, detonated a truck bomb outside the Alfred P. Murrah Federal Building, killing 168 people. This act was fueled by ideologies that demonized the federal government, highlighting how extremism and extremist propaganda can inspire individuals to commit acts of terror.
  2. The Oslo and Utøya Attacks (2011): Anders Behring Breivik, driven by anti-Muslim and anti-immigrant beliefs, bombed government buildings in Oslo, Norway, then shot and killed 69 people at a youth camp on the island of Utøya. Breivik’s manifesto cited many sources that painted Islam and multiculturalism as existential threats to Europe, showing the deadly impact of extremist online echo chambers and the pathology of right-wing ideologies such as Great Replacement Theory.
  3. The Pittsburgh Synagogue Shooting (2018): Robert Bowers, influenced by white supremacist ideologies and conspiracy theories about migrant caravans, killed 11 worshippers in a synagogue. His actions were preceded by social media posts that echoed hate speech and conspiracy theories rampant in certain online communities, demonstrating the lethal consequences of unmoderated hateful rhetoric.
  4. The El Paso Shooting (2019): Patrick Crusius targeted a Walmart in El Paso, Texas, killing 23 people, motivated by anti-immigrant sentiment and rhetoric about a “Hispanic invasion” of Texas. His manifesto mirrored language used in certain media and political discourse, underscoring the danger of using dehumanizing language against minority groups.
  5. Christchurch Mosque Shootings (2019): Brenton Tarrant live-streamed his attack on two mosques in Christchurch, New Zealand, killing 51 people, influenced by white supremacist beliefs and online forums that amplified Islamophobic rhetoric. The attacker’s manifesto and online activity were steeped in extremist content, illustrating the role of internet subcultures in radicalizing individuals.

Stochastic terrorism in right-wing politics in the US

In the United States, the concept of stochastic terrorism has become increasingly relevant in analyzing the tactics employed by certain right-wing entities and individuals. While the phenomenon is not exclusive to any one political ideology, recent years have seen notable instances in which right-wing rhetoric has been linked to acts of violence.

The January 6, 2021, attack on the U.S. Capitol serves as a stark example of stochastic terrorism. The event was preceded by months of unfounded claims of electoral fraud and calls to “stop the steal,” amplified by right-wing media outlets and figures — including then-President Trump, who had extraordinary motivation to portray his 2020 election loss as a victory in order to stay in power. This rhetoric created a charged environment, leading some individuals to believe that violent action was a justified response to defend democracy.

The role of media and technology

Right-wing media platforms have played a significant role in amplifying the kind of messaging that can fuel stochastic terrorism. Through the strategic use of incendiary language, disinformation, misinformation, and conspiracy theories, these platforms can reach vast audiences and influence susceptible individuals to commit acts of violence.

The advent of social media has further complicated the landscape, enabling the rapid spread of extremist rhetoric. The decentralized nature of these platforms allows for the creation of echo chambers where inflammatory messages are not only amplified but also go unchallenged, increasing the risk of radicalization.

Challenges and implications

Stochastic terrorism presents significant legal and societal challenges. The indirect nature of incitement complicates efforts to hold individuals accountable for the violence that their rhetoric may inspire. Moreover, the phenomenon raises critical questions about the balance between free speech and the prevention of violence, challenging societies to find ways to protect democratic values while preventing harm.

Moving forward

Addressing stochastic terrorism requires a multifaceted approach. This includes promoting responsible speech among public figures, enhancing critical thinking and media literacy among the public, and developing legal and regulatory frameworks that can effectively address the unique challenges posed by this form of terrorism. Ultimately, combating stochastic terrorism is not just about preventing violence; it’s about preserving the integrity of democratic societies and ensuring that public discourse does not become a catalyst for harm.

Understanding and mitigating the effects of stochastic terrorism is crucial in today’s increasingly polarized world. By recognizing the patterns and mechanisms through which violence is indirectly incited, societies can work towards more cohesive and peaceful discourse, ensuring that democracy is protected from the forces that seek to undermine it through fear and division.


The war in Ukraine is less “surprising” to those who have watched it raging since 2014. Although it escalated dramatically in 2022, the Ukraine timeline dates back to the collapse of the Soviet Union in 1991.

To understand the backstory — now inextricably intertwined with our own presidential history, from the impeachment of Donald Trump over his phone calls with Zelensky to the Republican Party’s present-day support for the aims of Vladimir Putin — we have to go back to a time when no one was stronger on anti-Russian policy than GOP darling Ronald Reagan.

  • 1991 — Ukraine declares independence amid the collapse of the Soviet Union
  • 1994 — Ukraine agrees to give up its nuclear arsenal in exchange for security assurances from Russia, the United States, and the United Kingdom (Budapest Memorandum)
  • 2004 — Viktor Yanukovych “wins” the election under dubious circumstances and is deposed in favor of a do-over election, which he loses to Viktor Yushchenko (Orange Revolution)
  • 2006 — Viktor Yanukovych begins working directly with Paul Manafort in an effort to rehabilitate his image after his electoral loss. Manafort was known for his 1980s lobbying work with Roger Stone (another infamous dirty trickster, best known as a fixer for Richard Nixon) on behalf of the “Torturers’ Lobby” of brutal dictators around the world.
  • 2007 — Yanukovych’s Party of Regions does well in the Ukrainian parliamentary elections, gaining a large number of seats, a result credited to Manafort’s advice on Western-style campaigning.
  • 2010 — Yanukovych is elected President of Ukraine, a comeback largely credited to Manafort’s strategies.
  • Nov 2013 — Having promised a more European-style government in order to win the presidency in 2010, Yanukovych goes back on his word and pursues more pro-Russian policies than Ukrainians had signed up for. He is soon beset by enormous public protests against the corruption of his regime and his unilateral decision to abandon an association agreement with the EU in favor of a trade agreement with Russia (Maidan Revolution / Revolution of Dignity)
  • Feb 2014 — After a harrowing 93 days barricaded inside Kyiv’s Maidan Square, activists are victorious; Yanukovych is deposed and flees to Russia
  • Mar 2014 — Russian forces invade and annex the region of Crimea within Ukraine
  • Apr 2014 — Russian forces invade the Donetsk and Luhansk regions in eastern Ukraine, escalating a war that continues to this day and had already killed more than 14,000 people by the time the 2022 large-scale invasion began
  • Apr 2014 — Hunter Biden and business partner Devon Archer join the board of Burisma
  • May 2014 — Candy magnate Petro Poroshenko succeeds Yanukovych as president of Ukraine
  • Feb 10, 2015 — Viktor Shokin takes office as the prosecutor general of Ukraine, tasked with getting a handle on rampant corruption
  • Oct 8, 2015 — US Assistant Secretary of State Victoria Nuland reiterates strong concerns that Shokin is failing to prosecute obvious corruption in Ukraine, and that efforts at anti-corruption must be stepped up there
  • Dec 8, 2015 — Then-VP and Ukraine point person Joe Biden gives a speech to the Ukrainian parliament, urging them to step up anti-corruption reforms to help strengthen their young democracy
  • Winter 2015–16 — Biden talks with Poroshenko about how Shokin is slow-walking their anti-corruption efforts
  • Feb 16, 2016 — Viktor Shokin resigns as Prosecutor General of Ukraine
  • May 12, 2016 — Yuriy Lutsenko is appointed as the new Prosecutor General, despite having no law degree or legal experience. At first he takes a hard line against Burisma.
  • Aug 14, 2016 — “Black ledger” payments to Paul Manafort from Viktor Yanukovych go public
  • May 10, 2017 — Trump hosts Russian Foreign Minister Sergei Lavrov and Ambassador Sergey Kislyak in the Oval Office, the day after firing James Comey as Director of the FBI over “the Russian thing” — only a photographer for the Russian news agency TASS is allowed to cover the meeting
  • June 2017 — The NotPetya malware emerges and causes extensive damage — especially in Ukraine. It is widely fingerprinted as a Russian state-sponsored attack.
  • October 30, 2017 — Paul Manafort is indicted by Special Counsel Robert Mueller for money laundering, acting as a foreign agent, making false statements, and conspiracy against the United States, as part of the ongoing investigation into Russian interference in the 2016 US presidential election.
  • Apr 30, 2018 — At a Trump dinner in his DC hotel, Lev Parnas and Igor Fruman tell Trump they think Ukraine Ambassador Yovanovitch isn’t loyal enough to him
  • May-June 2018 — Lev Parnas pressures US Congressman Pete Sessions to pressure Trump to fire Yovanovitch in exchange for campaign funding; he and Fruman are later arrested for this scheme and other federal charges of illegal foreign funding of election campaigns
  • Summer 2018 — Trump reportedly frets a potential Biden run for the presidency
  • August 2018 — Lev Parnas’s company, which is named (I kid you not) “Fraud Guarantee,” hires Rudy Giuliani’s firm for $500,000 to continue working on getting Ambassador Yovanovitch fired for doing her job pursuing corruption in Ukraine.
  • Sept 2018 — Congress passes and Trump signs a spending bill for the Department of Defense, including $250 million in military aid to Ukraine under the Ukraine Security Assistance Initiative (USAI)
  • Late 2018 — Lev Parnas arranges for Giuliani to meet with both Shokin and Lutsenko on multiple occasions; Devin Nunes also secretly meets with Shokin in Vienna.
  • Dec 6, 2018 — Trump presses Parnas and Fruman to push the Ukrainian government to open an investigation into the Bidens
  • Late Feb, 2019 — Parnas and Fruman pressure then-President Poroshenko to open an investigation into the Bidens, in exchange for a state visit at the White House that would help his challenging re-election campaign against the popular young upstart comedian Volodymyr Zelenskyy
  • Spring 2019 — A “working group” of Giuliani, Parnas, Fruman, conservative Hill reporter John Solomon, Joseph diGenova, Victoria Toensing, and Devin Nunes’s top aide Derek Harvey meets regularly to work on the quid pro quo project
  • March 2019 — Prosecutor General Lutsenko opens two investigations: one into alleged Ukrainian involvement in the 2016 US election (a Russian conspiracy theory) and a second into Hunter Biden’s involvement with Burisma (Lutsenko will later retract many of his allegations).
  • March 24, 2019 — Don Jr. tweets criticism of Ambassador Yovanovitch
  • March 28, 2019 — Giuliani hands off a smear-campaign packet of disinformation on Yovanovitch, intended for Secretary of State Mike Pompeo
  • April 24, 2019 — Trump orders Marie Yovanovitch recalled from her diplomatic mission in Ukraine, after Giuliani and other allies report that she is undermining and obstructing their efforts to extort Ukrainian president Volodymyr Zelensky into claiming he was investigating the Bidens for corruption.
  • July 25, 2019 — On a phone call with Zelensky, Trump pressures him to investigate Biden in exchange for the release of funds to keep the Russians at bay in Crimea. He disparages Yovanovitch on the call, referring to her as “bad news.”
  • Oct 3, 2019 — Ambassador to Ukraine Marie Yovanovitch is summarily fired by Donald Trump, despite recently having been invited to extend her post for several more years
  • Dec 18, 2019 — The House of Representatives votes to impeach Donald Trump for abuse of power and obstruction of Congress, the first of two times Trump will be impeached.
  • Feb 5, 2020 — The Republican-controlled Senate votes, largely along party lines and having called no witnesses, to acquit Donald Trump of both impeachment charges.
  • Feb 2022 — Russian forces begin a large-scale land invasion of Ukraine, including massive attacks on civilian cities.
  • Feb 2024 — Donald Trump holds up a bipartisan immigration deal in Congress that would allow military aid funds to Ukraine to be released. Running for a second term as US President, Trump continues to break with 80 years of the post-WWII international order — refusing to support NATO, the alliance widely credited with keeping the peace in Europe, and backing the regime of Vladimir Putin in Russia’s war of aggression against Ukraine.

The adrenochrome conspiracy theory is a complex and widely debunked claim that has its roots in various strands of mythology, pseudoscience, disinformation, and misinformation. It’s important to approach this topic with a critical thinking perspective, understanding that these claims are not supported by credible evidence or scientific understanding.

Origin and evolution of the adrenochrome theory

The origin of the adrenochrome theory can be traced back to the mid-20th century, but it gained notable prominence in the context of internet culture and conspiracy circles in the 21st century. Initially, adrenochrome was simply a scientific term referring to a chemical compound produced by the oxidation of adrenaline. However, over time, it became entangled in a web of conspiracy theories.

In literature, the first notable reference to adrenochrome appears in Aldous Huxley’s 1954 essay “The Doors of Perception,” where it’s mentioned in passing as a psychotropic substance. Its more infamous portrayal came with Hunter S. Thompson’s 1971 book “Fear and Loathing in Las Vegas,” where adrenochrome is depicted as a powerful hallucinogen. These literary representations played a significant role in shaping the later conspiracy narratives around the substance.

The conspiracy theory, explained

The modern adrenochrome conspiracy theory posits that a global elite, often linked to high-profile figures in politics, entertainment, and finance, harvests adrenochrome from human victims, particularly children. According to the theory, this substance is used for its supposed anti-aging properties or as a psychedelic drug.

This theory often intertwines with other conspiracy theories, such as those related to satanic ritual abuse and global cabals of elites. It gained significant traction on internet forums and through social media, particularly among groups inclined towards conspiratorial thinking. The adrenochrome theory is fundamentally antisemitic in its undertones, given its close resemblance to the ancient blood libel trope — used most famously by the Nazi regime to indoctrinate ordinary Germans into hating Jews.

Lack of scientific evidence

From a scientific perspective, adrenochrome is a real compound, but its properties are vastly different from what the conspiracy theory claims. It does not have hallucinogenic effects, nor is there any credible evidence to suggest it possesses anti-aging capabilities. The scientific community recognizes adrenochrome as a byproduct of adrenaline oxidation with limited physiological impact on the human body.

Impact and criticism

The adrenochrome conspiracy theory has been widely criticized for its baseless claims and potential to incite violence and harassment. Experts in psychology, sociology, and information science have pointed out the dangers of such unfounded theories, especially in how they can fuel real-world hostility and targeting of individuals or groups.

Furthermore, the theory diverts attention from legitimate issues related to child welfare and exploitation, creating a sensationalist and unfounded narrative that undermines genuine efforts to address these serious problems.

Psychological and social dynamics

Psychologists have explored why people believe in such conspiracy theories. Factors like a desire for understanding in a complex world, a need for control, and a sense of belonging to a group can drive individuals towards these narratives. Social media algorithms and echo chambers further reinforce these beliefs, creating a self-sustaining cycle of misinformation.

Various legal and social actions have been taken to combat the spread of the adrenochrome conspiracy and similar misinformation. Platforms like Facebook, Twitter, and YouTube have implemented policies to reduce the spread of conspiracy theories, including adrenochrome-related content. Additionally, educational initiatives aim to improve media literacy and critical thinking skills among the public to better discern fact from fiction.

Ultimately, the adrenochrome conspiracy theory is a baseless narrative that has evolved from obscure references in literature and pseudoscience to a complex web of unfounded claims, intertwined with other conspiracy theories. It lacks any credible scientific support and has been debunked by experts across various fields.

The theory’s prevalence serves as a case study in the dynamics of misinformation and the psychological underpinnings of conspiracy belief systems. Efforts to combat its spread are crucial in maintaining a well-informed and rational public discourse.


Peter Navarro reports to prison

Former Trump advisor Peter Navarro — who wrote a book claiming credit for the idea to try to overturn the 2020 election, bragging about it as the “Green Bay Sweep” to MSNBC’s Ari Melber — reported to prison today after the Supreme Court ruled he cannot get out of answering a Congressional subpoena. Navarro will serve four months following a jury conviction for contempt of Congress.

The sentencing judge rejected Navarro’s allegation that he was the victim of a political prosecution: “you aren’t,” Judge Mehta said. “You have received every process you are due.”


The backfire effect is a cognitive phenomenon that occurs when individuals are presented with information that contradicts their existing beliefs, leading them not only to reject the challenging information but also to further entrench themselves in their original beliefs.

This effect is counterintuitive, as one might expect that presenting factual information would correct misconceptions. However, due to various psychological mechanisms, the opposite can occur, complicating efforts to counter misinformation, disinformation, and the spread of conspiracy theories.

Origin and mechanism

The term “backfire effect” was popularized by researchers Brendan Nyhan and Jason Reifler, who in 2010 conducted studies demonstrating that corrections to false political information could actually deepen an individual’s commitment to their initial misconception. This effect is thought to stem from a combination of cognitive dissonance (the discomfort experienced when holding two conflicting beliefs) and identity-protective cognition (wherein individuals process information in a way that protects their sense of identity and group belonging).

Relation to media, disinformation, echo chambers, and media bubbles

In the context of media and disinformation, the backfire effect is particularly relevant. The proliferation of digital media platforms has made it easier than ever for individuals to encounter information that contradicts their beliefs — but paradoxically, it has also made it easier for them to insulate themselves in echo chambers and media bubbles, environments where their existing beliefs are constantly reinforced and rarely challenged.

Echo chambers refer to situations where individuals are exposed only to opinions and information that reinforce their existing beliefs, limiting their exposure to diverse perspectives. Media bubbles are similar, often facilitated by algorithms on social media platforms that curate content to match users’ interests and past behaviors, inadvertently reinforcing their existing beliefs and psychological biases.

Disinformation campaigns can exploit these dynamics by deliberately spreading misleading or false information, knowing that it is likely to be uncritically accepted and amplified within certain echo chambers or media bubbles. This can exacerbate the backfire effect, as attempts to correct the misinformation can lead to individuals further entrenching themselves in the false beliefs, especially if those beliefs are tied to their identity or worldview.

How the backfire effect happens

The backfire effect happens through a few key psychological processes:

  1. Cognitive Dissonance: When confronted with evidence that contradicts their beliefs, individuals experience discomfort. To alleviate this discomfort, they often reject the new information in favor of their pre-existing beliefs.
  2. Confirmation Bias: Individuals tend to favor information that confirms their existing beliefs and disregard information that contradicts them. This tendency towards bias can lead them to misinterpret or dismiss corrective information.
  3. Identity Defense: For many, beliefs are tied to their identity and social groups. Challenging these beliefs can feel like a personal attack, leading individuals to double down on their beliefs as a form of identity defense.

Prevention and mitigation

Preventing the backfire effect and its impact on public discourse and belief systems requires a multifaceted approach:

  1. Promote Media Literacy: Educating the public on how to critically evaluate sources and understand the mechanisms behind the spread of misinformation can empower individuals to think critically and assess the information they encounter.
  2. Encourage Exposure to Diverse Viewpoints: Breaking out of media bubbles and echo chambers by intentionally seeking out and engaging with a variety of perspectives can reduce the likelihood of the backfire effect by making conflicting information less threatening and more normal.
  3. Emphasize Shared Values: Framing challenging information in the context of shared values or goals can make it less threatening to an individual’s identity, reducing the defensive reaction.
  4. Use Fact-Checking and Corrections Carefully: Presenting corrections in a way that is non-confrontational and, when possible, aligns with the individual’s worldview or values can make the correction more acceptable. Visual aids and narratives that resonate with the individual’s experiences or beliefs can also be more effective than plain factual corrections.
  5. Foster Open Dialogue: Encouraging open, respectful conversations about contentious issues can help to humanize opposing viewpoints and reduce the instinctive defensive reactions to conflicting information.

The backfire effect presents a significant challenge in the fight against misinformation and disinformation, particularly in the context of digital media. Understanding the psychological underpinnings of this effect is crucial for developing strategies to promote a more informed and less polarized public discourse. By fostering critical thinking, encouraging exposure to diverse viewpoints, and promoting respectful dialogue, it may be possible to mitigate the impact of the backfire effect and create a healthier information ecosystem.


The “wallpaper effect” is a phenomenon in media, propaganda, and disinformation where individuals become influenced or even indoctrinated by being continuously exposed to a particular set of ideas, perspectives, or ideologies. This effect is akin to wallpaper in a room, which, though initially noticeable, becomes part of the unnoticed background over time.

The wallpaper effect plays a significant role in shaping public opinion and individual beliefs, often without the conscious awareness of the individuals affected.

Origins and mechanisms

The term “wallpaper effect” stems from the idea that constant exposure to a specific type of media or messaging can subconsciously influence an individual’s perception and beliefs, similar to how wallpaper in a room becomes a subtle but constant presence. This effect is potentiated by the human tendency to seek information that aligns with existing beliefs, known as confirmation bias. It leads to a situation where diverse viewpoints are overlooked, and a singular perspective dominates an individual’s information landscape.

The wallpaper effect, by DALL-E 3

Media and information bubbles

In the context of media, the wallpaper effect is exacerbated by the formation of information bubbles or echo chambers. These are environments where a person is exposed only to opinions and information that reinforce their existing beliefs.

The rise of digital media and personalized content algorithms has intensified this effect, as users often receive news and information tailored to their preferences, further entrenching their existing viewpoints. More insidiously, social media platforms tend to earn higher profits when they fill users’ feeds with ideological perspectives they already agree with. More profitable still is tilting users toward more extreme versions of those beliefs — a practice that in other contexts we call “radicalization.”
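This engagement-driven drift can be illustrated with a deliberately simplified toy model. Everything here is hypothetical: beliefs and posts are reduced to a single axis from -1 to +1, and the "engagement score" is an invented stand-in, not any real platform's ranking system. The feed always serves the agreeable-but-charged post that maximizes that score, and the user's belief drifts toward whatever they consume:

```python
import random

def engagement_score(belief, item):
    """Toy engagement model: users engage most with content they broadly
    agree with (agreement gate) that is also emotionally charged
    (extremity factor). Both belief and item lie in [-1, 1]."""
    agreement = max(0.0, 1.0 - abs(belief - item))
    extremity = abs(item)
    return agreement * extremity

def simulate_feed(start_belief=0.2, steps=200, learning_rate=0.05, seed=0):
    """Each step: sample candidate posts, serve the one maximizing
    engagement, and nudge the user's belief toward what was consumed."""
    rng = random.Random(seed)
    belief = start_belief
    for _ in range(steps):
        candidates = [rng.uniform(-1.0, 1.0) for _ in range(20)]
        served = max(candidates, key=lambda item: engagement_score(belief, item))
        belief += learning_rate * (served - belief)
    return belief

if __name__ == "__main__":
    # A user starting mildly off-center (+0.2) is ratcheted toward the
    # extreme end of the axis through repeated exposure alone.
    print(simulate_feed(start_belief=0.2))
```

Because the score rewards posts that are both agreeable and more extreme than the user's current position, each recommendation sits slightly further out than the user, and the belief ratchets outward, which is the mechanism the "wallpaper effect" and radicalization literature describe in qualitative terms.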

Role in propaganda and disinformation

The wallpaper effect is a critical tool in propaganda and disinformation campaigns. By consistently presenting a specific narrative or viewpoint, these campaigns can subtly alter the perceptions and beliefs of the target audience. Over time, the repeated exposure to these biased or false narratives becomes a backdrop to the individual’s understanding of events, issues, or groups, often leading to misconceptions or unwarranted biases.

Psychological impact

The psychological impact of the wallpaper effect is profound. It can lead to a narrowing of perspective, where individuals become less open to new information or alternative viewpoints. This effect can foster polarized communities and hyperpartisan politics, in which dialogue and understanding between differing viewpoints become increasingly difficult.

Case studies and examples

Historically, authoritarian regimes have used the wallpaper effect to control public opinion and suppress dissent. By monopolizing the media landscape and continuously broadcasting their propaganda, these regimes effectively shaped the public’s perception of reality.

In contemporary times, this effect is also seen in democracies, where partisan news outlets or social media algorithms create a similar, though more fragmented, landscape of information bubbles.

Counteracting the wallpaper effect

Counteracting the wallpaper effect involves a multifaceted approach. Media literacy education is crucial, as it empowers individuals to critically analyze and understand the sources and content of information they consume.

Encouraging exposure to a wide range of viewpoints and promoting critical thinking skills are also essential strategies. Additionally, reforms in digital media algorithms to promote diverse viewpoints and reduce the creation of echo chambers can help mitigate this effect.

Implications for democracy and society

The wallpaper effect has significant implications for democracy and society. It can lead to a polarized public, where consensus and compromise become challenging to achieve. The narrowing of perspective and entrenchment of beliefs can undermine democratic discourse, leading to increased societal divisions and decreased trust in media and institutions.

The wallpaper effect is a critical phenomenon that shapes public opinion and belief systems. Its influence is subtle yet profound, as constant exposure to a specific set of ideas can subconsciously mold an individual’s worldview. Understanding and addressing this effect is essential in promoting a healthy, informed, and open society. Efforts to enhance media literacy, promote diverse viewpoints, and reform digital media practices are key to mitigating the wallpaper effect and fostering a more informed and less polarized public.


Election denialism, the refusal to accept credible election outcomes, has significantly impacted U.S. history, especially in recent years. This phenomenon is not entirely new; election denial has roots that stretch back through various periods of American history. However, its prevalence and intensity have surged in the contemporary digital and political landscape, influencing public trust, political discourse, and the very fabric of democracy.

Historical context

Historically, disputes over election outcomes are as old as the U.S. electoral system itself. For instance, the fiercely contested 1800 election between Thomas Jefferson and John Adams resulted in a constitutional amendment (the 12th Amendment) to prevent similar confusion in the future. The 1876 election between Rutherford B. Hayes and Samuel J. Tilden was resolved through the Compromise of 1877, which effectively ended Reconstruction and had profound effects on the Southern United States.

Yet these instances, while contentious, were resolved within the framework of existing legal and political mechanisms, without denying the legitimacy of the electoral process itself. Over time, claims of election fraud would come to be levied against the electoral and political system itself — with dangerous implications for the peaceful transfer of power upon which democracy rests.

Voting box in an election, by Midjourney

The 21st century and digital influence

Fast forward to the 21st century, and election denialism has taken on new dimensions, fueled by the rapid dissemination of disinformation (and misinformation) through digital media and a polarized political climate. The 2000 Presidential election, with its razor-thin margins and weeks of legal battles over Florida’s vote count, tested the country’s faith in the electoral process.

Although the Supreme Court‘s decision in Bush v. Gore was deeply controversial, Al Gore’s concession helped to maintain the American tradition of peaceful transitions of power.

The 2020 Election: A flashpoint

The 2020 election, marked by the COVID-19 pandemic and an unprecedented number of mail-in ballots, became a flashpoint for election denialism. Claims of widespread voter fraud and electoral malfeasance were propagated at the highest levels of government, despite a lack of evidence substantiated by multiple recounts, audits, and legal proceedings across several states.

The refusal to concede by President Trump and the storming of the U.S. Capitol on January 6, 2021, marked a watershed moment in U.S. history, when election denialism moved from the fringes to the center of political discourse, challenging the norms of democratic transition. Widely referred to as The Big Lie, the baseless claims of election fraud that persist on the right wing to this day are regarded by justice officials, legal analysts, and a host of concerned citizens as themselves a form of election fraud, part of ongoing attempts to overthrow democracy in the United States.

Implications, public trust, and voter suppression

The implications of this recent surge in election denialism are far-reaching. It has eroded public trust in the electoral system, with polls indicating a significant portion of the American populace doubting the legitimacy of election results. This skepticism is not limited to the national level but has trickled down to local elections, with election officials facing threats and harassment. The spread of misinformation, propaganda, and conspiracy theories about electoral processes and outcomes has become a tool for political mobilization, often exacerbating divisions within American society.

Moreover, election denialism has prompted legislative responses at the state level, with numerous bills introduced to restrict voting access in the name of election security. These measures have sparked debates about voter suppression and the balance between securing elections and ensuring broad electoral participation. The challenge lies in addressing legitimate concerns about election integrity while avoiding the disenfranchisement of eligible voters.

Calls for reform and strengthening democracy

In response to these challenges, there have been calls for reforms to strengthen the resilience of the U.S. electoral system. These include measures to enhance the security and transparency of the voting process, improve the accuracy of voter rolls, and counter misinformation about elections. There’s also a growing emphasis on civic education to foster a more informed electorate capable of critically evaluating electoral information.

The rise of election denialism in recent years highlights the fragility of democratic norms and the crucial role of trust in the electoral process. While disputes over election outcomes are not new, the scale and impact of recent episodes pose unique challenges to American democracy. Addressing these challenges requires a multifaceted approach, including legal, educational, and technological interventions, to reinforce the foundations of democratic governance and ensure that the will of the people is accurately and fairly represented.


The term “hoax” is derived from “hocus,” a term that has been in use since the late 18th century. It originally referred to a trick or deception, often of a playful or harmless nature. The essence of a hoax was its capacity to deceive, typically for entertainment or to prove a point without malicious intent. Over time, the scope and implications of a hoax have broadened significantly. What was once a term denoting jest or trickery has morphed into a label for deliberate falsehoods intended to mislead or manipulate public perception.

From playful deception to malicious misinformation

As society entered the age of mass communication, the potential reach and impact of hoaxes expanded dramatically. The advent of newspapers, radio, television, and eventually the internet and social media platforms, transformed the way information—and misinformation—circulated. Hoaxes began to be used not just for amusement but for more nefarious purposes, including political manipulation, financial fraud, and social engineering. The line between a harmless prank and damaging disinformation and misinformation became increasingly blurred.

The political weaponization of “hoax”

In the contemporary political landscape, particularly within the US, the term “hoax” has been co-opted as a tool for disinformation and propaganda. This strategic appropriation has been most visible among certain factions of the right-wing, where it is used to discredit damaging information, undermine factual reporting, and challenge the legitimacy of institutional findings or scientific consensus. This application of “hoax” serves multiple purposes: it seeks to sow doubt, rally political bases, and divert attention from substantive issues.

the politicization of hoaxes, via fake scandals that tie up the media unwittingly in bullshit for years, by DALL-E 3

This tactic involves labeling genuine concerns, credible investigations, and verified facts as “hoaxes” to delegitimize opponents and minimize the impact of damaging revelations. It is a form of gaslighting on a mass scale, where the goal is not just to deny wrongdoing but to erode the very foundations of truth and consensus. By branding something as a “hoax,” these actors attempt to preemptively dismiss any criticism or negative information, regardless of its veracity.

Case Studies: The “Hoax” label in action

High-profile instances of this strategy include the dismissal of climate change data, the denial of election results, and the rejection of public health advice during the COVID-19 pandemic. In each case, the term “hoax” has been employed not as a description of a specific act of deception, but as a blanket term intended to cast doubt on the legitimacy of scientifically or empirically supported conclusions. This usage represents a significant departure from the term’s origins, emphasizing denial and division over dialogue and discovery.

The impact on public discourse and trust

The strategic labeling of inconvenient truths as “hoaxes” has profound implications for public discourse and trust in institutions. It creates an environment where facts are fungible, and truth is contingent on political allegiance rather than empirical evidence. This erosion of shared reality undermines democratic processes, hampers effective governance, and polarizes society.

Moreover, the frequent use of “hoax” in political discourse dilutes the term’s meaning and impact, making it more difficult to identify and respond to genuine instances of deception. When everything can be dismissed as a hoax, the capacity for critical engagement and informed decision-making is significantly compromised.

Moving Forward: Navigating a “post-hoax” landscape

The challenge moving forward is to reclaim the narrative space that has been distorted by the misuse of “hoax” and similar terms. This involves promoting media literacy, encouraging critical thinking, and fostering a public culture that values truth and accountability over partisanship. It also requires the media, educators, and public figures to be vigilant in their language, carefully distinguishing between genuine skepticism and disingenuous dismissal.

The evolution of “hoax” from a term denoting playful deception to a tool for political disinformation reflects broader shifts in how information, truth, and reality are contested in the public sphere. Understanding this transformation is crucial for navigating the complexities of the modern informational landscape and for fostering a more informed, resilient, and cohesive society.


Wealth Cult -- rich men behaving badly, by Midjourney

A network of exceedingly wealthy individuals and organizations has channeled its vast fortunes into influencing American politics, policy, and public opinion: in effect, they have formed a wealth cult. And they have leveraged that cult and its considerable fortune to influence, and in many ways dramatically transform, American politics.

The term “dark money” refers to political spending meant to influence the decision-making and critical thinking of the public and lawmakers where the source of the money is not disclosed. This lack of transparency makes it challenging to trace the influence back to its origins, hence the term “dark.”

And, it is dark indeed.

Wealth cult anchors the trench coat

The Wealth Cult is one of three primary groups, or clusters, supporting the right wing and, generally, the Republican Party. It anchors the trench coat by funding the two cults above it: the Christian Cult and the White Cult.

Its story is stealthy and significant.

A bunch of billionaires raise a toast to themselves, by Midjourney

The wealth cult has funded disinformation campaigns and the spread of conspiracy theories, created fake social movements through astroturfing, enabled violent extremists to attack their country’s capitol, cruelly deprived vulnerable people (especially immigrants, poor people, and women) of the kind of state aid granted generously throughout the developed world, bribed regulators, rigged elections, crashed economies, and on and on, all in service of its extremist free-market ideology.

They believe in “makers and takers,” or Mudsill Theory, as it was once called by Senator James Henry Hammond, a racist, slavery enthusiast, and pedophile. Some people were born to serve others, they say. Hierarchies are natural, they claim. Wealthy men should make all the decisions — because that’s what’s best for everyone, they say in paternalistic tones.

I don’t buy it. I believe all men are created equal. So did a certain Founder of our country.


An echo chamber is a metaphorical description of a situation where an individual is encased in a bubble of like-minded information, reinforcing pre-existing views without exposure to opposing perspectives. This concept has gained prominence with the rise of digital and social media, where algorithms personalize user experiences, inadvertently isolating individuals from diverse viewpoints and enabling people to remain cloistered within a closed system that may contain misinformation and disinformation.

The role of digital media and algorithms

Digital platforms and social media leverage algorithms to tailor content that aligns with users’ past behaviors and preferences. This personalization, while enhancing engagement, fosters filter bubbles—closed environments laden with homogeneous information.

Such settings are ripe for the unchecked proliferation of disinformation, as they lack the diversity of opinion necessary for critical scrutiny. The need for critical thinking is greatly diminished when we are only ever exposed to information and beliefs we already agree with.
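The feedback loop described above can be made concrete with a toy sketch. This is purely illustrative — real recommendation systems are vastly more complex and proprietary — but it shows the core dynamic: when an algorithm optimizes for past engagement, each click narrows what the user sees next. All item names and viewpoint labels here are invented for the example.

```python
# Toy sketch of an engagement-driven recommender (illustrative only;
# no real platform's algorithm is this simple or this transparent).
from collections import Counter

# Each hypothetical item is tagged with a viewpoint label.
ITEMS = {
    "a1": "left", "a2": "left", "a3": "left",
    "b1": "right", "b2": "right", "b3": "right",
    "c1": "center", "c2": "center",
}

def recommend(history, k=3):
    """Recommend up to k unseen items matching the viewpoint the user
    has engaged with most -- the feedback loop that narrows exposure."""
    if not history:
        return list(ITEMS)[:k]  # cold start: no personalization yet
    # Find the user's most-engaged viewpoint so far.
    top_view = Counter(ITEMS[i] for i in history).most_common(1)[0][0]
    # Only surface more of the same viewpoint.
    pool = [i for i, v in ITEMS.items() if v == top_view and i not in history]
    return pool[:k]

# A user who clicked two "left" items is shown only more "left" items.
print(recommend(["a1", "a2"]))  # → ['a3']
```

After just two clicks, the simulated user never sees "right" or "center" items again — a miniature filter bubble. The point is not that platforms literally filter by viewpoint label, but that optimizing for predicted engagement produces a structurally similar narrowing.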

Disinformation in echo chambers

Echo chambers serve as breeding grounds for disinformation, where false information is designed to mislead and manipulate. In these closed loops, disinformation finds little resistance and is readily accepted and amplified, bolstering existing biases and misconceptions.

We all have psychological traits that make us vulnerable to believing things that aren’t true. Whether sourced via deception, misinterpretation, conspiracy theories, propaganda, or other phenomena, false beliefs are made stickier and harder to debunk when one is surrounded by an echo chamber.

Political polarization exacerbated

Beyond individual users, the isolation facilitated by echo chambers contributes significantly to broader political polarization. As people become entrenched in their informational silos, the common ground necessary for democratic discourse dwindles. This division not only fosters extremism but also undermines the social cohesion essential for a functioning democracy.

The impact of confirmation bias

Within echo chambers, confirmation bias—the tendency to favor information that corroborates existing beliefs—becomes particularly pronounced. This cognitive bias solidifies ideological positions, making individuals resistant to changing their views, even in the face of contradictory evidence.

The effects of echo chambers also transcend digital boundaries, influencing real-world political landscapes. Political actors can exploit these dynamics to deepen divides, manipulate public opinion, and mobilize support based on misinformation, leading to a polarized and potentially radicalized electorate.

Strategies for mitigation

Combating the challenges posed by echo chambers and disinformation necessitates a comprehensive approach:

  • Media Literacy: Educating the public to critically assess information sources, understand content personalization, and identify bias and disinformation.
  • Responsible Platform Design: Encouraging digital platforms to modify algorithms to promote diversity in content exposure and implement measures against disinformation.
  • Regulatory Interventions: Policymakers may need to step in to ensure digital environments foster healthy public discourse.

Echo chambers, particularly within the digital media landscape, significantly impact the spread of disinformation and political polarization. By reinforcing existing beliefs and isolating individuals from diverse perspectives, they contribute to a divided society. Addressing this issue is critical and requires efforts in education, platform design, and regulation to promote a more informed and cohesive public discourse.


The phenomenon of anti-vaccination disinformation, often referred to as the “anti-vax” movement, is a complex and multifaceted issue that has evolved over time, particularly in the United States. It intersects with public health, misinformation, societal trust, and cultural dynamics — to name a few.

History and evolution in the U.S.

The roots of anti-vaccination sentiment in the U.S. can be traced back to the 19th century. Initially, it was based on religious and philosophical grounds, with some opposition to the smallpox vaccine. However, the contemporary form of the anti-vax movement gained momentum in the late 20th and early 21st centuries.

A significant turning point was a 1998 study published by Andrew Wakefield, which falsely linked the MMR vaccine (measles, mumps, and rubella) to autism. Despite being debunked and retracted, this study sowed seeds of doubt about vaccine safety.

a vaccine needle, by Midjourney

Key proponents and spreaders of disinformation

The modern anti-vax movement is characterized by its diversity, ranging from fringe conspiracy theorists to wellness influencers and some celebrities. The internet and social media have been crucial in disseminating anti-vaccine misinformation.

Websites, forums, and social media platforms have allowed the rapid spread of false claims, often amplified by algorithms that favor sensational content — because that’s what keeps people consuming content on the sites. It’s part of a larger process of radicalization that social media can contribute to.

Impact on society and culture

The impact of anti-vaccination disinformation is profound and multifaceted:

  1. Public Health: It poses a significant threat to public health. Reduced vaccination rates can lead to outbreaks of preventable diseases, as seen in the resurgence of measles in recent years and in refusals to be vaccinated against COVID-19.
  2. Trust in Science and Institutions: It erodes trust in medical science, healthcare professionals, and public health institutions. This skepticism extends beyond vaccines, undermining broader public health measures and fueling a growing science denialism in the culture at large.
  3. Social Polarization: It contributes to social, cultural, and political polarization. Vaccination status has become a contentious issue, often intertwined with political and ideological beliefs.
  4. Economic Impact: There are also economic implications, as disease outbreaks require significant resources to manage and can disrupt communities and businesses.

Combatting anti-vaccination disinformation

Addressing anti-vaccination disinformation requires a multi-pronged approach:

  1. Promoting Accurate Information: Healthcare professionals, scientists, and public health officials need to proactively disseminate accurate, easy-to-understand information about vaccines. This includes addressing common misconceptions and providing transparent information about vaccine development, safety, and efficacy.
  2. Engaging with Concerns: It’s essential to engage respectfully with individuals who have concerns about vaccines. Many people who hesitate are not staunchly anti-vaccine but may have genuine questions or fears that need addressing.
  3. Media Literacy and Critical Thinking: Promoting media literacy and critical thinking skills can help individuals discern reliable information from misinformation.
  4. Policy and Regulation: There’s a role for policy and regulation in addressing misinformation on social media and other platforms. This includes holding platforms accountable for the spread of false information and considering policies around vaccine requirements for certain activities or institutions.
  5. Community Engagement: Leveraging community leaders, including religious and cultural figures, can be effective in promoting vaccination, particularly in communities that are distrustful of government or mainstream healthcare.
  6. Global Perspective: Finally, recognizing that this is a global issue, international cooperation and support are essential, especially in countering misinformation in low- and middle-income countries.

virus, by Midjourney

Combating anti-vaccination disinformation is a complex task that requires a nuanced understanding of its historical roots, the mechanisms of its spread, and its societal impacts. Efforts must be multidisciplinary, involving healthcare professionals, educators, policy makers, and community leaders.

The ultimate goal is to foster an environment where informed decisions about vaccinations are made based on credible information, thus protecting public health and societal well-being. To that end, we’ve got a long way to go.
