Politics

A “meme” is a term first coined by British evolutionary biologist Richard Dawkins in his 1976 book “The Selfish Gene.” Originally, it referred to an idea, behavior, or style that spreads from person to person within a culture. However, in the digital age, the term has evolved to specifically denote a type of media – often an image with text, but sometimes a video or a hashtag – that spreads rapidly online, typically through social media platforms like Facebook, Twitter/X, Reddit, TikTok, and virtually every other extant platform.

Memes on the digital savannah

In the context of the internet, memes are a form of digital content that encapsulates a concept, joke, or sentiment in a highly relatable and easily shareable format. They often consist of a recognizable image or video, overlaid with humorous or poignant text that pertains to current events, popular culture, or universal human experiences. Memes have become a cornerstone of online communication, offering a way for individuals to express opinions, share laughs, and comment on societal norms.

Grumpy Cat meme: "There are two types of people in this world... and I hate them"

Once primarily a tool of whimsy, amusement, and even uplift, memes have in recent years been increasingly weaponized by trolls and bad actors as part of a broader shift in internet culture towards incivility and exploitation. The days of funny cats have been encroached upon by the racism and antisemitism of Pepe the Frog, beloved patron saint meme of the alt-right. The use of memes to project cynicism or thinly-veiled white supremacy into culture and politics is an unwelcome trend that throws cold water on the formerly more innocent days of meme yore online.

Memes as tools of disinformation and information warfare

While memes are still used for entertainment and social commentary, they have also become potent tools for disseminating disinformation and conducting information warfare, both domestically and abroad. This is particularly evident in political arenas where, for instance, American right-wing groups have leveraged memes to spread their ideologies, influence public opinion, and discredit opposition.

  1. Simplicity and Virality: Memes are easy to create and consume, making them highly viral. This simplicity allows for complex ideas to be condensed into easily digestible and shareable content, often bypassing critical analysis from viewers.
  2. Anonymity and Plausible Deniability: The often-anonymous nature of meme creation and sharing allows individuals or groups to spread disinformation without accountability. The humorous or satirical guise of memes also provides a shield of plausible deniability against accusations of spreading falsehoods.
  3. Emotional Appeal: Memes often evoke strong emotional responses, which can be more effective in influencing public opinion than presenting factual information. The American right-wing, among other groups, has adeptly used memes to evoke feelings of pride, anger, or fear, aligning such emotions with their political messages.
  4. Echo Chambers and Confirmation Bias: Social media algorithms tend to show users content that aligns with their existing beliefs, creating echo chambers. Memes that reinforce these beliefs are more likely to be shared within these circles, further entrenching ideologies and sometimes spreading misinformation.
  5. Manipulation of Public Discourse: Memes can be used to distract from important issues, mock political opponents, or oversimplify complex social and political problems. This can skew public discourse and divert attention from substantive policy discussions or critical events.
  6. Targeting the Undecided: Memes can be particularly effective in influencing individuals who are undecided or less politically engaged. Their simplicity and humor can be more appealing than traditional forms of political communication, making them a powerful tool for shaping opinions.

Memes in political campaigns

Memes have been used to discredit candidates or push particular narratives that favor right-wing ideologies. Memes have also been employed to foster distrust in mainstream media and institutions, promoting alternative, often unfounded narratives that align with right-wing agendas.

Trump QAnon meme: "The Storm is Coming" in Game of Thrones font, shared on Truth Social

While often benign and humorous, memes can also be wielded as powerful tools of disinformation and information warfare. The American right-wing, among other political groups globally, has harnessed the viral nature of memes to influence public opinion, manipulate discourse, and spread their ideologies. As digital media continues to evolve, the role of memes in political and social spheres is likely to grow, making it crucial for consumers to approach them with a critical eye.

Read more

Cyberbullying involves the use of digital technologies, like social media, texting, and websites, to harass, intimidate, or embarrass individuals. Unlike traditional bullying, its digital nature allows for anonymity and a wider audience. Cyberbullies employ various tactics such as sending threatening messages, spreading rumors online, posting sensitive or derogatory information, or impersonating someone to damage their reputation — escalating, in the worst cases, to more sinister and dangerous actions like doxxing.

Geopolitical impact of cyberbullying

In recent years, cyberbullying has transcended personal boundaries and infiltrated the realm of geopolitics. Nation-states or politically motivated groups have started using cyberbullying tactics to intimidate dissidents, manipulate public opinion, or disrupt political processes in other countries. Examples include spreading disinformation, launching smear campaigns against political figures, or using bots to amplify divisive content. This form of cyberbullying can have significant consequences, destabilizing societies and influencing elections.

Recognizing cyberbullying

Identifying cyberbullying involves looking for signs of digital harassment. This can include receiving repeated, unsolicited, and aggressive communications, noticing fake profiles spreading misinformation about an individual, or observing coordinated attacks against a person or group. In geopolitics, recognizing cyberbullying might involve identifying patterns of disinformation, noting unusual social media activity around sensitive political topics, or detecting state-sponsored troll accounts.
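
To make the idea of “patterns” concrete, here is a minimal, purely illustrative sketch in Python of one such heuristic: flagging bursts of near-identical posts made by many different accounts within a short time window. The data format, thresholds, and field names are assumptions invented for the example; this is a sketch of the general idea, not any platform’s actual detection system.

# Illustrative only: flag clusters of accounts posting near-identical text
# within a short window, one rough signal of coordinated troll activity.
from collections import defaultdict
from datetime import datetime, timedelta

def normalize(text):
    # Crude normalization: lowercase and collapse whitespace.
    return " ".join(text.lower().split())

def flag_coordinated(posts, window=timedelta(minutes=10), min_accounts=5):
    """posts: list of dicts with 'account', 'text', and 'time' (datetime)."""
    by_text = defaultdict(list)
    for p in posts:
        by_text[normalize(p["text"])].append(p)

    flagged = []
    for text, group in by_text.items():
        group.sort(key=lambda p: p["time"])
        for i, first in enumerate(group):
            # Posts of the same text landing within `window` of this one.
            burst = [p for p in group[i:] if p["time"] - first["time"] <= window]
            accounts = {p["account"] for p in burst}
            if len(accounts) >= min_accounts:
                flagged.append((text, sorted(accounts)))
                break
    return flagged

# Toy usage: six accounts posting the same message within a few minutes.
now = datetime(2024, 1, 1, 12, 0)
sample = [{"account": f"user{i}", "text": "The election was RIGGED!!",
           "time": now + timedelta(minutes=i)} for i in range(6)]
print(flag_coordinated(sample))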

Responding to cyberbullying

The response to cyberbullying varies based on the context and severity. For individuals, it involves:

  1. Documentation: Keep records of all bullying messages or posts.
  2. Non-engagement: Avoid responding to the bully, as engagement often escalates the situation.
  3. Reporting: Report the behavior to the platform where it occurred and, if necessary, to law enforcement.
  4. Seeking Support: Reach out to friends, family, or professionals for emotional support.

For geopolitical cyberbullying, responses are more complex and involve:

  1. Public Awareness: Educate the public about the signs of state-sponsored cyberbullying and disinformation.
  2. Policy and Diplomacy: Governments can implement policies to counteract foreign cyberbullying and engage in diplomatic efforts to address these issues internationally.
  3. Cybersecurity Measures: Strengthen cybersecurity infrastructure to prevent and respond to cyberbullying at the state level.

Cyberbullying, in its personal and geopolitical forms, represents a significant challenge in the digital age. Understanding its nature, recognizing its signs, and knowing how to respond are crucial steps in mitigating its impact. For individuals, it means being vigilant online and knowing when to seek help. In the geopolitical arena, it requires a coordinated effort from governments, tech companies, and the public to defend against these insidious forms of digital aggression. By taking these steps, societies can work towards a safer, more respectful digital world.

Read more

The “repetition effect” is a potent psychological phenomenon and a common propaganda device. This technique operates on the principle that repeated exposure to a specific message or idea increases the likelihood of its acceptance as truth or normalcy by an individual or the public. Its effectiveness lies in its simplicity and its exploitation of a basic human cognitive bias: the more we hear something, the more likely we are to believe it.

Repetition effect, by Midjourney

Historical context

The repetition effect has been used throughout history, but its most notorious use was by Adolf Hitler and the Nazi Party in Germany. Hitler, along with his Propaganda Minister, Joseph Goebbels, effectively employed this technique to disseminate Nazi ideology and promote antisemitism. In his autobiography “Mein Kampf,” Hitler wrote about the importance of repetition in reinforcing the message and ensuring that it reached the widest possible audience. He believed that the constant repetition of a lie would eventually be accepted as truth.

Goebbels echoed this sentiment, famously stating, “If you tell a lie big enough and keep repeating it, people will eventually come to believe it.” The Nazi regime used this strategy in various forms, including in speeches, posters, films, and through controlled media. The relentless repetition of anti-Semitic propaganda, the glorification of the Aryan race, and the demonization of enemies played a crucial role in the establishment and maintenance of the Nazi regime.

Psychological basis

The effectiveness of the repetition effect is rooted in cognitive psychology, in a bias known as the “illusory truth effect”: repeated exposure to a statement increases its perceived truthfulness. The phenomenon is tied to the ease with which familiar information is processed. When we hear something repeatedly, it becomes more fluent to process, and our brains misinterpret this fluency as a signal for truth.

Modern era usage

The transition into the modern era saw the repetition effect adapting to new media and communication technologies. In the age of television and radio, political figures and advertisers used repetition to embed messages in the public consciousness. The rise of the internet and social media has further amplified the impact of this technique. In the digital age, the speed and reach of information are unprecedented, making it easier for false information to be spread and for the repetition effect to be exploited on a global scale.

The repetition effect on screens and social media, by Midjourney

Political campaigns, especially in polarized environments, often use the repetition effect to reinforce their messages. The constant repetition of slogans, talking points, and specific narratives across various platforms solidifies these messages in the public’s mind, regardless of their factual accuracy.

Ethical considerations and countermeasures

The ethical implications of using the repetition effect are significant, especially when it involves spreading disinformation or harmful ideologies. It raises concerns about the manipulation of public opinion and the undermining of democratic processes.

To counteract the repetition effect, media literacy and critical thinking are essential. Educating the public about this psychological bias and encouraging skepticism towards repeated messages can help mitigate its influence. Fact-checking and the promotion of diverse sources of information also play a critical role in combating the spread of falsehoods reinforced by repetition.

Repetition effect: A key tool of propaganda

The repetition effect is a powerful psychological tool in the arsenal of propagandists and communicators. From its historical use by Hitler and the fascists to its continued relevance in the digital era, this technique demonstrates the profound impact of repeated messaging on public perception and belief.

While it can be used for benign purposes, such as in advertising or reinforcing positive social behaviors, its potential for manipulation and spreading misinformation cannot be overstated. Understanding and recognizing the repetition effect is crucial in developing a more discerning and informed approach to the information we encounter daily.

Read more

The term “alternative facts” gained widespread attention on January 22, 2017, when Kellyanne Conway, then-Counselor to President Donald Trump, used it during a “Meet the Press” interview. Conway was defending White House Press Secretary Sean Spicer’s false statements, made the previous day, about the attendance numbers at Trump’s presidential inauguration.

When challenged by the interviewer, who cited several facts indicating a much smaller crowd size relative to President Obama’s inauguration, Conway asserted that Spicer was offering “alternative facts” in response to media reports, which suggested a lower attendance compared to previous inaugurations.

Kellyanne Conway, by Midjourney

Philosophical and Historical Context

The term, while new in its specific phrasing, taps into a long-standing philosophical debate about truth and reality. Historically, the idea that there can be different interpretations of facts has roots in relativism and constructivism.

However, the way “alternative facts” was used implied a more radical departure from the accepted notion of objective facts, tilting towards a post-truth era where the line between truth and falsehood becomes blurred. It signaled, early in the Trump administration, an intentional strategy of disseminating disinformation, articulated out loud in a way that no previous presidency had done before.

Use in US politics

The use of “alternative facts” in US politics has been controversial and highly debated. Proponents argue that the term simply reflects different perspectives and interpretations of events. Critics, however, see it as an attempt to legitimize falsehoods or misleading information, particularly when used by those in power to contradict evidence and well-established facts.

The term quickly became symbolic of the Trump administration’s relationship with the media and its approach to information dissemination. It was seen as part of a broader strategy that involved discrediting mainstream media as so-called “fake news,” promoting favorable narratives, and challenging the notion of objective truth. It extended the already prevalent right-wing strategy of science denialism into a kind of denialism of reality itself — a dangerous path towards authoritarianism reminiscent of the use of Newspeak in George Orwell’s classic dystopian novel, 1984.

Donald Trump spewing "Alternative Facts" into the disinformation ecosystem, by Midjourney

Implications for American democracy

The implications of the widespread use of “alternative facts” for American democracy are profound and multifaceted:

  1. Erosion of Trust: The concept challenges the role of a free press and fact-checking institutions in democracy. When official statements are at odds with verifiable evidence, it erodes public trust in both the government and the media.
  2. Polarization: It exacerbates political polarization. When people cannot agree on basic facts, finding common ground becomes challenging, leading to a more divided society.
  3. Manipulation and Propaganda: The term can be weaponized for political ends, allowing for manipulation of public opinion and spreading propaganda.
  4. Accountability and Governance: In a democracy, accountability is key. If leaders are seen to use “alternative facts” without consequence, it undermines democratic governance and the expectation that leaders are truthful and transparent.
  5. Public Discourse and Decision Making: Accurate information is crucial for informed decision making by the electorate. When false information is disseminated under the guise of “alternative facts,” it impairs the public’s ability to make informed decisions.
  6. Legal and Ethical Concerns: The concept raises ethical concerns about honesty and integrity in public office and can complicate legal proceedings when factual accuracy is disputed.

The dangers of “reality denial” and “alternative facts” in political discourse

“Alternative facts,” as a term and a concept, represents more than just a linguistic novelty; it signifies a shift in the landscape of political discourse and the relationship between truth, power, and democracy. Its emergence and use reflect deeper tensions in society about trust, media, and the nature of reality itself. For American democracy, grappling with the implications of this term is not just an intellectual exercise but a necessary endeavor to preserve the integrity of our democratic institutions and public discourse.

It’s one thing to have legitimately different perspectives on the issues. It’s quite another to throw out the founding ideals and Enlightenment principles of rational inquiry, scientific observation, and reality testing altogether. If we cannot agree even on the basic facts of a situation, the ability to arrive at any kind of policy consensus about how to solve the problems that will always arise in society is deeply impaired — and indeed, our democracy is placed in great peril.

We must recommit fully to the finding of Actual Facts — and put behind us the childish nursing of our favored Alternative Facts.

Read more

Shitposting, a term that has seeped into the mainstream of internet culture, is often characterized by the act of posting deliberately provocative, off-topic, or nonsensical content in online communities and on social media. The somewhat vulgar term encapsulates a spectrum of online behavior ranging from harmless, humorous banter to malicious, divisive content.

Typically, a shit-post is defined by its lack of substantive content, its primary goal being to elicit attention and reactions — whether amusement, confusion, or irritation — from its intended audience. Closely related to trolling, shitposting is one aspect of a broader pantheon of bad faith behavior online.

Shit-poster motivations

The demographic engaging in shit-posting is diverse, cutting across various age groups, social strata, and political affiliations. However, it’s particularly prevalent among younger internet users who are well-versed in meme culture and online vernacular. The motivations for shit-posting can be as varied as its practitioners.

Some engage in it for humor and entertainment, seeing it as a form of digital performance art. Others may use it as a tool for social commentary or satire, while a more nefarious subset might employ it to spread disinformation and misinformation, sow discord, and/or harass individuals or groups.

Online trolls shitposting on the internet, by Midjourney

Context in US politics

In the realm of U.S. politics, shit-posting has assumed a significant role in recent elections, especially on platforms like Twitter / X, Reddit, and Facebook. Politicians, activists, and politically engaged individuals often use this tactic to galvanize supporters, mock opponents, or shape public perception. It’s not uncommon to see political shit-posts that are laden with irony, exaggeration, or out-of-context information, designed to inflame passions or reinforce existing biases — or exploit them.

Recognition and response

Recognizing shit-posting involves a discerning eye. Key indicators include the use of hyperbole, irony, non-sequiturs, and content that seems outlandishly out of place or context. The tone is often mocking or sarcastic. Visual cues, such as memes or exaggerated images, are common.

Responding to shit-posting is a nuanced affair. Engaging with it can sometimes amplify the message, which might be the poster’s intention. A measured approach is to assess the intent behind the post. If it’s harmless humor, it might warrant a light-hearted response or none at all.

For posts that are disinformation or border on misinformation or toxicity, countering with factual information, reporting the content, or choosing not to engage are viable strategies. The key is not to feed into the cycle of provocation and reaction that shit-posting often seeks to perpetuate.

Shitposting troll farms lurk in the shadows, beaming disinformation across the land, by Midjourney

Fighting back

Shit-posting, in its many forms, is a complex phenomenon in the digital age. It straddles the line between being a form of modern-day satire and a tool for misinformation, propaganda, and/or cyberbullying. As digital communication continues to evolve, understanding the nuances of shit-posting – its forms, motivations, and impacts – becomes crucial, particularly in politically charged environments. Navigating this landscape requires a balanced approach, blending awareness, discernment, and thoughtful engagement.

This overview provides a basic understanding of shit-posting, but the landscape is ever-changing, with new forms and norms continually emerging. The ongoing evolution of online communication norms, including phenomena like shit-posting, is particularly fascinating and significant in the broader context of digital culture and political discourse.

Read more

Science denialism has a complex and multifaceted history, notably marked by a significant event in 1953 that set a precedent for the tactics of disinformation widely observed in various spheres today, including politics.

The 1953 meeting and the birth of the disinformation playbook

The origins of modern science denial can be traced back to a pivotal meeting in December 1953, involving the heads of the four largest American tobacco companies. This meeting was a response to emerging scientific research linking smoking to lung cancer — a serious existential threat to their business model.

Concerned about the potential impact on their business, these industry leaders collaborated with a public relations firm, Hill & Knowlton, to craft a strategy. This strategy was designed not only to dispute the growing evidence about the health risks of smoking, but also to manipulate public perception by creating doubt about the science itself. They created the Tobacco Industry Research Committee (TIRC) as an organization to cast doubt on the established science, and prevent the public from knowing about the lethal dangers of smoking.

And it worked — for over 40 years. The public never formed a consensus on the lethality and addictiveness of nicotine until well into the 1990s, when the jig was finally up and Big Tobacco had to pay a record-breaking $200 billion settlement under the Tobacco Master Settlement Agreement (MSA) of 1998, after four decades of mercilessly lying to the American people.

smoking and the disinformation campaign of Big Tobacco leading to science denialism, by Midjourney

Strategies of the disinformation playbook

This approach laid the groundwork for what is often referred to as the “disinformation playbook.” The key elements of this playbook include creating doubt about scientific consensus, funding research that could contradict or cloud scientific understanding, using think tanks or other organizations to promote these alternative narratives, and influencing media and public opinion to maintain policy and regulatory environments favorable to their interests — whether profit, power, or both.

Over the next 7 decades — up to the present day — this disinformation playbook has been used by powerful special interests to cast doubt, despite scientific consensus, on acid rain, depletion of the ozone layer, the viability of Ronald Reagan’s Strategic Defense Initiative (SDI), and perhaps most notably: the man-made causes of climate change.

Adoption and adaptation in various industries

The tobacco industry’s tactics were alarmingly successful for decades, delaying effective regulation and public awareness of smoking’s health risks. These strategies were later adopted and adapted by various industries and groups facing similar scientific challenges to their products or ideologies. For instance, the fossil fuel industry used similar tactics to cast doubt on global warming — leading to the phenomenon of climate change denialism. Chemical manufacturers have disputed science on the harmful effects of certain chemicals like DDT and BPA.

What began as a PR exercise by Big Tobacco to preserve their fantastic profits once science discovered the deleterious health effects of smoking eventually evolved into a strategy of fomenting science denialism more broadly. Why discredit one single finding of the scientific community when you could cast doubt on the entire process of science itself — as a way of future-proofing any government regulation that might curtail your business interests?

Science denial in modern politics

In recent years, the tactics of science denial have become increasingly prevalent in politics. Political actors, often influenced by corporate interests or ideological agendas, have employed these strategies to challenge scientific findings that are politically inconvenient — despite strong and often overwhelming evidence. This is evident in manufactured “debates” on climate change, vaccine safety, and COVID-19, where scientific consensus is often contested not based on new scientific evidence but through disinformation strategies aimed at sowing doubt and confusion.

The role of digital media and politicization

The rise of social media has accelerated the spread of science denial. The digital landscape allows for rapid dissemination of misinformation and the formation of echo chambers, where groups can reinforce shared beliefs or skepticism, often insulated from corrective or opposing information. Additionally, the politicization of science, where scientific findings are viewed through the lens of political allegiance rather than objective evidence, has further entrenched science denial in modern discourse — as just one aspect of the seeming politicization of absolutely everything in modern life and culture.

Strategies for combatting science denial

The ongoing impact of science denial is profound. It undermines public understanding of science, hampers informed decision-making, and delays action on critical issues like climate change, public health, and environmental protection. The spread of misinformation about vaccines, for instance, has led to a decrease in vaccination rates and a resurgence of diseases like measles.

scientific literacy, by Midjourney

To combat science denial, experts suggest several strategies. Promoting scientific literacy and critical thinking skills among the general public is crucial. This involves not just understanding scientific facts, but also developing an understanding of the scientific method and how scientific knowledge is developed and validated. Engaging in open, transparent communication about science, including the discussion of uncertainties and limitations of current knowledge, can also help build public trust in science.

Science denial, rooted in the strategies developed by the tobacco industry in the 1950s, has evolved into a significant challenge in contemporary society, impacting not just public health and environmental policy but also the very nature of public discourse and trust in science. Addressing this issue requires a multifaceted approach, including education, transparent communication, and collaborative efforts to uphold the integrity of scientific information.

Read more

Climate Change Denial: From Big Tobacco Tactics to Today’s Global Challenge

In the complex narrative of global climate change, one pervasive thread is the phenomenon of climate change denial. This denial isn’t just a refusal to accept the scientific findings around climate change; it is a systematic effort to discredit and cast doubt on environmental realities and the need for urgent action.

Remarkably, the roots of this denial can be traced back to the strategies used by the tobacco industry in the mid-20th century to obfuscate the link between smoking and lung cancer. Industries with a vested interest in fossil fuels later borrowed those strategies, conspiring to create a disinformation campaign against the growing scientific consensus on the manmade nature of climate change and to cast doubt on the link between the burning of fossil fuels and the destruction of the planet’s natural ecosystems. And the playbook succeeded for over half a century, beginning with its debut in 1953.

climate change and its denial, by Midjourney

Origins in big tobacco’s playbook

The origins of climate change denial lie in a well-oiled, public relations machine initially designed by the tobacco industry. When scientific studies began linking smoking to lung cancer in the 1950s, tobacco companies launched an extensive campaign to challenge these findings. Their strategy was not to disprove the science outright but to sow seeds of doubt, suggesting that the research was not conclusive and that more studies were needed. This strategy of manufacturing doubt proved effective in delaying regulatory and public action against tobacco products, for more than 5 decades.

Adoption by climate change deniers

This playbook was later adopted by those seeking to undermine climate science. In the late 20th century, as scientific consensus grew around the human impact on global warming, industries and political groups with a vested interest in maintaining the status quo began to employ similar tactics of lying at scale. They funded research to challenge or undermine climate science, supported think tanks and lobbyists to influence public opinion and policy, and used media outlets to spread a narrative of uncertainty and skepticism.

Political consequences

The political consequences of climate change denial have been profound. In the United States and other countries, it has polarized the political debate over environmental policy, turning what is fundamentally a scientific issue into a partisan one. This politicization has hindered comprehensive national and global policies to combat climate change, as legislative efforts are often stalled by ideological conflicts.

a burning forest of climate change, by Midjourney

Denial campaigns have also influenced public opinion, creating a significant segment of the population that is skeptical of climate science years after overwhelming scientific consensus has been reached, which further complicates efforts to implement wide-ranging environmental reforms.

Current stakes and global impact

Today, the stakes of climate change denial could not be higher. As the world faces increasingly severe consequences of global warming — including extreme weather events, rising sea levels, and disruptions to ecosystems — the need for decisive action becomes more urgent. Yet, climate change denial continues to impede progress. By casting doubt on scientific consensus, it hampers efforts to build the broad public support necessary for bold environmental policies that may help thwart or mitigate some of the worst disasters.

Moreover, climate change denial poses a significant risk to developing countries, which are often the most vulnerable to climate impacts but the least equipped to adapt. Denialism in wealthier nations can lead to a lack of global cooperation and support needed to address these challenges comprehensively.

Moving forward: acknowledging the science and embracing action

To effectively combat climate change, it is crucial to recognize the roots and ramifications of climate change denial. Understanding its origins in the Big Tobacco disinformation strategy helps demystify the tactics used to undermine environmental science. It’s equally important to acknowledge the role of political and economic interests in perpetuating this denial — oil tycoon Charles Koch alone spends almost $1 billion per election cycle, much of it going to climate deniers.

A climate change desert, by Midjourney

However, there is a growing global movement acknowledging the reality of climate change and the need for urgent action. From international agreements like the Paris Accord to grassroots activism pushing for change, there is a mounting push against the tide of denial.

Climate change denial, with its roots in the Big Tobacco playbook, poses a significant obstacle to global efforts to address environmental challenges. Its political ramifications have stalled critical policy initiatives, and its ongoing impact threatens global cooperation. As we face the increasing urgency of climate change, acknowledging and countering this denial is crucial for paving the way towards a more sustainable and resilient future.

Read more

Sockpuppets are fake social media accounts used by trolls for deceptive and covert actions, avoiding culpability for abuse, aggression, death threats, doxxing, and other criminal acts against targets.

In the digital age, the battleground for political influence has extended beyond traditional media to the vast, interconnected realm of social media. Central to this new frontier are “sockpuppet” accounts – fake online personas created for deceptive purposes. These shadowy figures have become tools in the hands of authoritarian regimes, perhaps most notably Russia, to manipulate public opinion and infiltrate the political systems of countries like the UK, Ukraine, and the US.

What are sockpuppet accounts?

A sockpuppet account is a fake online identity used for purposes of deception. Unlike simple trolls or spam accounts, sockpuppets are more sophisticated. They mimic real users, often stealing photos and personal data to appear authentic. These accounts engage in activities ranging from posting comments to spreading disinformation, all designed to manipulate public opinion.

The Strategic Use of Sockpuppets

Sockpuppet accounts are a cog in the larger machinery of cyber warfare. They play a critical role in shaping narratives and influencing public discourse. In countries like Russia, where the state exerts considerable control over media, these accounts are often state-sponsored or affiliated with groups that align with government interests.

Case Studies: Russia’s global reach

  1. The United Kingdom: Investigations have revealed Russian interference in the Brexit referendum. Sockpuppet accounts spread divisive content to influence public opinion and exacerbate social tensions. Their goal was to weaken the European Union by supporting the UK’s departure.
  2. Ukraine: Russia’s geopolitical interests in Ukraine have been furthered through a barrage of sockpuppet accounts. These accounts disseminate pro-Russian propaganda and misinformation to destabilize Ukraine’s political landscape, particularly during times of crisis, during elections, or — most notably — throughout Russia’s ongoing war of aggression against its neighbor.
  3. The United States: The 2016 US Presidential elections saw an unprecedented level of interference. Russian sockpuppets spread divisive content, fake news, and even organized real-life events, creating an environment of distrust and chaos. Their goal was to sow discord and undermine the democratic process.
Vladimir Putin with his sheep, by Midjourney

How sockpuppets operate

Sockpuppets often work in networks, creating an echo chamber effect. They amplify messages, create false trends, and give the illusion of widespread support for a particular viewpoint. Advanced tactics include deepfakes and AI-generated text, making it increasingly difficult to distinguish between real and fake content.

Detection and countermeasures

Detecting sockpuppets is challenging due to their evolving sophistication. Social media platforms are employing AI-based algorithms to identify and remove these accounts. However, the arms race between detection methods and evasion techniques continues. Governments and independent watchdogs also play a crucial role in exposing such operations.
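
As a purely illustrative sketch of what even the simplest heuristic scoring can look like, the Python snippet below ranks accounts by a handful of commonly cited warning signs (account age, posting rate, duplicated content, follower-to-following ratio). The features, weights, and thresholds are assumptions chosen for the example; real platform systems are proprietary and far more sophisticated.

# Illustrative only: a toy "sockpuppet-likeness" score built from a few
# commonly cited warning signs. Weights and cutoffs are invented.
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    age_days: int           # how old the account is
    posts_per_day: float    # average posting rate
    duplicate_ratio: float  # share of posts copying other accounts' text (0-1)
    followers: int
    following: int

def suspicion_score(a: Account) -> float:
    score = 0.0
    if a.age_days < 30:
        score += 1.0                      # brand-new account
    if a.posts_per_day > 50:
        score += 1.0                      # inhumanly high posting rate
    score += 2.0 * a.duplicate_ratio      # copy-pasted content weighs heavily
    if a.following > 0 and a.followers / a.following < 0.1:
        score += 0.5                      # follows many, followed by few
    return score

accounts = [
    Account("organic_user", 2000, 3, 0.02, 400, 350),
    Account("freedom_eagle_1776_2", 12, 120, 0.80, 15, 900),
]
for a in sorted(accounts, key=suspicion_score, reverse=True):
    print(f"{a.handle}: {suspicion_score(a):.1f}")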

Implications for democracy

The use of sockpuppet accounts by authoritarian regimes like Russia poses a significant threat to democratic processes. By influencing public opinion and political outcomes in other countries, they undermine the very essence of democracy – the informed consent of the governed. This digital interference erodes trust in democratic institutions and fuels political polarization.

As we continue to navigate the complex landscape of digital information, the challenge posed by sockpuppet accounts remains significant. Awareness and vigilance are key. Social media platforms, governments, and individuals must collaborate to safeguard the integrity of our political systems. As citizens, staying informed and critically evaluating online information is our first line of defense against this invisible but potent threat.

Read more

Deep fakes, a term derived from “deep learning” (a subset of AI) and “fake,” refer to highly realistic, AI-generated digital forgeries of real human beings. These sophisticated imitations can be videos, images, or audio clips where the person appears to say or do things they never actually did.

The core technology behind deep fakes is based on machine learning and neural network algorithms. Two competing AI systems work in tandem: one generates the fake content, while the other attempts to detect the forgeries. Over time, as the detection system identifies flaws, the generator learns from these mistakes, leading to increasingly convincing fakes.
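
What that paragraph describes is, in essence, a generative adversarial setup. Below is a minimal, hedged sketch in Python (assuming PyTorch is installed) of that feedback loop on toy one-dimensional data: a generator learns to imitate a target distribution while a discriminator learns to tell real samples from generated ones. Production deep-fake systems use far larger image, video, and audio models, but the adversarial training loop is the same basic idea.

# Toy adversarial training loop (assumes PyTorch). Real deep fakes use large
# image/video models; this only illustrates the generator-vs-detector idea.
import torch
import torch.nn as nn

torch.manual_seed(0)

def real_batch(n=64):
    # "Real" data: samples from a Gaussian the generator must learn to imitate.
    return torch.randn(n, 1) * 0.5 + 2.0

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):
    # 1) Train the detector (discriminator) to tell real from fake.
    real = real_batch()
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1)) +
              loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator to fool the detector; it learns from the
    #    detector's feedback, the loop described in the paragraph above.
    fake = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# After training, generated samples should cluster near the real mean (~2.0).
print("mean of generated samples:", generator(torch.randn(1000, 8)).mean().item())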

Deep fakes in politics

As the technology has become more accessible, it’s been used for various purposes, not all of them benign. In the political realm, deep fakes have a potential for significant impact. They’ve been used to create false narratives or manipulate existing footage, making it appear as though a public figure has said or done something controversial or scandalous. This can be particularly damaging in democratic societies, where public opinion heavily influences political outcomes. Conversely, in autocracies, deep fakes can be a tool for propaganda or to discredit opposition figures.

How to identify deep fakes

Identifying deep fakes can be challenging, but there are signs to look out for:

  1. Facial discrepancies: Imperfections in the face-swapping process can result in blurred or fuzzy areas, especially where the face meets the neck and hair. Look for any anomalies in facial expressions or movements that don’t seem natural.
  2. Inconsistent lighting and shadows: AI can struggle to replicate the way light interacts with physical objects. If the lighting or shadows on the face don’t match the surroundings, it could be a sign of manipulation.
  3. Audiovisual mismatches: Often, the audio does not perfectly sync with the video in a deep fake. Watch for delays or mismatches between spoken words and lip movements.
  4. Unusual blinking and breathing patterns: AI can struggle to accurately mimic natural blinking and breathing, leading to unnatural patterns.
  5. Contextual anomalies: Sometimes, the content of the video or the actions of the person can be a giveaway. If it seems out of character or contextually odd, it could be fake.

In democratic societies, the misuse of deep fakes can erode public trust in media, manipulate electoral processes, and increase political polarization. Fake videos can quickly spread disinformation and misinformation, influencing public opinion and voting behavior. Moreover, they can be used to discredit political opponents with false accusations or fabricated scandals.

In autocracies, deep fakes can be a potent tool for state propaganda. Governments can use them to create a false image of stability, prosperity, or unity, or conversely, to produce disinformation campaigns against perceived enemies, both foreign and domestic. This can lead to the suppression of dissent and the manipulation of public perception to support the regime.

Deep fakes with Donald Trump, by Midjourney

Response to deep fakes

The response to the threat posed by deep fakes has been multifaceted. Social media platforms and news organizations are increasingly using AI-based tools to detect and flag deep fakes. There’s also a growing emphasis on digital literacy, teaching the public to critically evaluate the content they consume.

Legal frameworks are evolving to address the malicious use of deep fakes. Some countries are considering legislation that would criminalize the creation and distribution of harmful deep fakes, especially those targeting individuals or designed to interfere in elections.

While deep fakes represent a remarkable technological advancement, they also pose a significant threat to the integrity of information and democratic processes. As this technology evolves, so must our ability to detect and respond to these forgeries. It’s crucial for both individuals and institutions to stay informed and vigilant against the potential abuses of deep fakes, particularly in the political domain. As we continue to navigate the digital age, the balance between leveraging AI for innovation and safeguarding against its misuse remains a key challenge.

Read more

The Moon landing hoax conspiracy theory posits that the United States faked the Apollo 11 Moon landing in 1969 as well as the subsequent Apollo missions. Despite overwhelming evidence to the contrary, proponents of this theory claim that NASA, with the possible assistance of other organizations, orchestrated a deception to win the Space Race against the Soviet Union during the Cold War.

Apollo moon landing hoax conspiracy theory, by Midjourney

Origin and Spread of the Theory

The theory took root in the early 1970s, gaining traction with the book “We Never Went to the Moon: America’s Thirty Billion Dollar Swindle” by Bill Kaysing, published in 1974. Kaysing, who had been a technical writer for a company that helped build the Saturn V rocket, argued that the technology to land on the Moon did not exist and that the Apollo missions were staged on Earth.

In the following decades, the conspiracy theory was perpetuated through books, documentaries, and internet forums. Notably, the 2001 Fox television special “Conspiracy Theory: Did We Land on the Moon?” brought renewed attention to these claims, featuring interviews with experts and conspiracy theorists.

Saturn V rocket lineup, by Midjourney

Main claims of the theory

  1. Photographic and video anomalies: Conspiracy theorists point to perceived inconsistencies in the Apollo mission photographs and videos. These include arguments about shadows and lighting, the absence of stars in lunar sky photos, and the appearance of the American flag, which seemed to flutter as if in the wind.
  2. Technical and scientific implausibility: Skeptics argue that the technology of the 1960s was not advanced enough for a Moon landing. They claim that the Van Allen radiation belts surrounding Earth would have been lethal to astronauts, and that the lunar module could not have functioned as claimed.
  3. Political motives: At the height of the Cold War, the United States was locked in a technological and ideological battle with the Soviet Union. Landing on the Moon would assert American dominance in space technology. Conspiracy theorists suggest that this was a compelling motive for the U.S. government to fabricate the Moon landings.

Counterarguments and evidence against the theory

The Moon landing conspiracy theory has been extensively debunked by scientists, astronauts, and historians. Key counterarguments include:

  1. Technical rebuttals: Scientific explanations have been provided for each of the supposed anomalies in the Apollo mission photos and videos. For example, the absence of stars is attributed to the camera’s exposure settings, and the peculiar behavior of the flag is explained by the way it was constructed and moved.
  2. Third-party evidence: Independent tracking of the Apollo missions by several countries and the presence of reflectors on the Moon’s surface, placed there during the Apollo missions and still used for laser ranging experiments, provide evidence of the landings.
  3. Feasibility of a hoax: The scale of the alleged deception would have required the involvement and silence of thousands of people, including NASA employees and contractors, which experts argue is highly improbable. Furthermore, the Soviet Union, America’s primary competitor in space, never contested the Moon landings, which they likely would have if there were any evidence of a hoax.
Moon landing probe by Midjourney

Cultural Impact

The Moon landing conspiracy theory is often cited as an example of modern pseudoscience and the influence of misinformation. It reflects a broader public skepticism towards government and scientific authorities, amplified in the digital age by the internet and social media. The persistence of this theory highlights the challenges of combating false information and the importance of critical thinking and media literacy.

Conclusion

While the Moon landing hoax conspiracy theory continues to have adherents, it is overwhelmingly dismissed by the scientific community and regarded as a case study in conspiracy thinking. The Apollo Moon landings remain one of humanity’s most significant technological achievements, backed by a wealth of evidence and scientific consensus. The theory, however, serves as a reminder of the ongoing need to educate the public about scientific methodology and the evaluation of evidence.

Read more

The deep state

The “deep state” conspiracy theory, particularly as it has been emphasized by supporters of former President Donald Trump, alleges the existence of a hidden, powerful network within the U.S. government, working to undermine and oppose Trump’s presidency and agenda. In reality, the epithet is an elaborate way of discrediting the non-partisan civil service personnel who are brought into government for their expertise and competence, and who typically remain in their posts through presidential transitions regardless of which party occupies the White House.

The deep state gathers in front of the US Capitol, by Midjourney

Origin of the deep state meme

The term “deep state” originated in Turkey in the 1990s, referring to a clandestine network of military officers and their civilian allies who, it was believed, were manipulating Turkish politics. In the American context, the term was popularized during the Trump administration as a meme, evolving to imply a shadowy coalition — echoing other popular conspiracy theories such as the antisemitic global cabal theory — within the U.S. government, including intelligence agencies, the civil service, and other parts of the bureaucracy.

Main claims

  1. Bureaucratic opposition: The theory posits that career government officials, particularly in intelligence and law enforcement agencies, are systematically working against Trump’s interests. This includes alleged sabotage of his policies and leaking information to the media.
  2. Manipulation of information: Proponents believe that these officials manipulate or withhold information to influence government policy and public opinion against Trump.
  3. Alleged connections with other theories: The deep state theory often intersects with other conspiracy theories, like those surrounding the investigation of Russian interference in the 2016 election and the impeachment proceedings against Trump. It suggests these events were orchestrated by the deep state to discredit or destabilize his presidency.

Contextual factors

  1. Political polarization: The rise of the deep state theory is partly attributed to the increasing political polarization in the U.S. It serves as a narrative to explain and rally against perceived opposition within the government.
  2. Media influence: Certain media outlets and social media platforms have played a significant role in propagating this theory. It’s often amplified by commentators who support Trump, contributing to its widespread dissemination among his base.
  3. Trump’s endorsement: Trump himself has referenced the deep state, particularly when discussing investigations into his administration or when responding to criticism from within the government.

Criticism and counterarguments to deep state “theory”

  1. Lack of concrete evidence: Critics argue that the deep state theory lacks substantial evidence. They contend that routine government processes, checks and balances, and the separation of powers are mischaracterized as clandestine operations.
  2. Undermining trust in institutions: There’s concern that such theories undermine public trust in vital governmental institutions, particularly those responsible for national security and law enforcement.
  3. Political tool: Detractors view the deep state concept as a political tool used to dismiss or discredit legitimate investigation and opposition.
Deep state conspiracy theory, as illustrated by Midjourney

Impact on governance and society

  1. Influence on supporters: For many Trump supporters, the deep state theory provides an explanatory framework for understanding his political challenges and defeats. It galvanizes his base by portraying him as an outsider battling corrupt, entrenched interests.
  2. Public trust and conspiracism: The theory contributes to a broader erosion of trust in government and institutions, fostering a climate where conspiratorial thinking becomes more mainstream.
  3. Policy implications: Belief in the deep state can impact policy discussions and decisions, as it frames certain government actions and policies as inherently suspect or malicious.

Comparative perspective

Globally, similar theories exist in various forms, often reflecting local political and historical contexts. They typically emerge in situations where there is a distrust of the political establishment and are used to explain perceived injustices or power imbalances.

The deep state conspiracy theory as espoused by Trump’s MAGA movement plays a significant role in current American political discourse, impacting public perception of government, policy debates, and the broader social and political climate. Its lack of verifiable evidence and potential to undermine democratic institutions make it a dangerous propaganda prop applied recklessly by the current GOP frontrunner for the 2024 nomination.

Read more

PizzaGate originated in 2016 from the hacked emails of John Podesta, Hillary Clinton’s campaign manager, published by WikiLeaks. Internet users on platforms like 4chan and Reddit began to interpret these emails, focusing on those that mentioned pizza and other food items. They falsely claimed these were code words for a child sex trafficking ring operated by high-ranking Democratic Party members and associated with a Washington, D.C., pizzeria named Comet Ping Pong.

The theory was fueled by various coincidences and misinterpretations. For instance, references to pizza were interpreted as part of a secret code, and the pizzeria’s quirky artwork was misconstrued as sinister symbolism. Despite the lack of credible evidence, these interpretations quickly gained traction online.

PizzaGate conspiracy theory, imagined by Midjourney

The broader political context

PizzaGate should be understood within the broader political context of the 2016 U.S. presidential election. This period was marked by intense partisanship and the proliferation of disinformation and fake news, with social media acting as a catalyst. The theory emerged against the backdrop of a highly contentious election, with Hillary Clinton as a polarizing figure. In such a climate, conspiracy theories found fertile ground to grow, particularly among those predisposed to distrust the political establishment.

Impact and aftermath

The most immediate and dangerous impact of PizzaGate was an incident in December 2016, when Edgar Maddison Welch, motivated by the conspiracy theory, fired a rifle inside Comet Ping Pong. Fortunately, there were no injuries. This incident highlighted the real-world consequences of online conspiracy theories and underscored the potential for online rhetoric to inspire violent actions.

In the aftermath, social media platforms faced criticism for allowing the spread of baseless allegations. This led to discussions about the role of these platforms in disseminating fake news and the balance between free speech and the prevention of harm.

Lasting effects

PizzaGate had several lasting effects:

  1. Polarization and distrust: It exacerbated political polarization and distrust towards mainstream media and political figures, particularly among certain segments of the population.
  2. Conspiracy culture: The incident became a significant part of the modern conspiracy culture, linking it to other conspiracy theories and contributing to a growing skepticism of official narratives.
  3. Social media policies: It influenced how social media companies manage content, leading to stricter policies against misinformation and the promotion of conspiracy theories.
  4. Public awareness: On a positive note, it raised public awareness about the dangers of misinformation and the importance of critical thinking in the digital age.
  5. Legitimacy of investigations: The theory, though baseless, led some people to question the legitimacy of genuine investigations into sexual misconduct and abuse, potentially undermining efforts to address these serious issues.

Caveat, Internet

PizzaGate serves as a stark reminder of the power of the internet to spread misinformation and the real-world consequences that can ensue. It reflects the complexities of the digital age, where information, regardless of its veracity, can be disseminated widely and rapidly. As we continue to navigate this landscape, understanding phenomena like PizzaGate becomes crucial in fostering a more informed and discerning online community — as well as thwarting the march of fascism.

Read more

ParadoxBot is an adorable chatbot who will cheerfully inform you about the Dark Arts

Sure, you could use the site search. Or, you could have a bot — try having a conversation with my blog via the following AI chatbot, ParadoxBot.

Ask it about conspiracy theories, or narcissism, or cults, or authoritarianism, or fascism, or disinformation — to name a few. You can also ask it about things like dark money, economics, history, and many topics at the intersection of political psychology.

It doesn’t index what’s on Foundations (yet), but it has ingested this site and you can essentially chat with the site itself via the ChatGPT-like interface below. Enjoy! And if you love it or hate it, find me on BlueSky (as @doctorparadox) or Mastodon and let me know your thoughts.

Tips for using ParadoxBot

  • Follow general good practice regarding prompt engineering.
  • If you don’t get an answer right away, try rephrasing your question. Even omitting or adding one word sometimes produces good results.
  • Try both broad and specific types of queries.
  • Dig deeper into any areas the bot turns up that sound interesting.
Read more

“Love bombing” is a manipulative tactic employed to gain emotional control over an individual by showering them with affection, compliments, and promises. The technique is used by both narcissists and cults, often for similar objectives — to overwhelm a target with positive feelings in order to secure their loyalty. Understanding the nuances of love bombing can be crucial for identifying and avoiding this core tactic of emotional predators.

Love bombing by narcissists

Narcissists use love bombing as a way to quickly establish emotional dependency. They may shower their target with gifts, compliments, and an overwhelming amount of attention. This is often done during the “honeymoon phase” of a relationship, creating an illusion of a perfect partner who is deeply in love.

How to identify love bombing by a narcissist:

  1. Intensity: The affection and attention feel overwhelming and come on very strong.
  2. Rapid progression: The relationship moves quickly, often skipping normal stages of emotional intimacy.
  3. Idealization: You are put on a pedestal, and any flaws you have are either ignored or spun into positive traits.

How to avoid it:

  1. Pace yourself: Slow down the relationship and insist on a more typical progression.
  2. Seek outside opinions: Consult trusted friends or family about the relationship, and share your misgivings about its pace of progression.
  3. Set boundaries: Make your limits clear and stick to them. If someone is pushing back and not respecting the boundaries you set, it is yet another red flag of potential narcissistic personality disorder (NPD) traits.

Love bombing by cults

In the context of cults and cultish groups, love bombing serves to recruit and retain members. It is one of a host of different influence techniques these groups deploy: newcomers are often greeted with extreme enthusiasm, given immediate friendship, and showered with positive affirmation. The objective is to create a euphoric emotional state that is then associated with the cult — making it harder to leave later, when the cracks begin to show.

How to identify love bombing by a cult or high-demand group:

  1. Instant community: You receive immediate acceptance and friendship from multiple members.
  2. Unconditional affection: Love and acceptance seem to be given freely, without the need for personal growth or change.
  3. Isolation: Efforts to separate you from your existing support network and even your family, making you dependent on the cult for emotional support.

How to avoid it:

  1. Be skeptical: Question why you’re receiving so much attention and what the group might want in return.
  2. Research: Look into the group’s history, beliefs, and any reports or articles about them.
  3. Maintain outside connections: Keep in touch with your existing network and consider their opinions. The group may encourage secrecy, but sharing your experiences outside the group and getting a wider perspective on them is critical.

General tips for avoiding love bombing

  1. Trust your instincts: If something feels too good to be true, it probably is.
  2. Time: Time is your ally. Manipulators often need you to make quick decisions. The more time you take, the more likely you are to see inconsistencies in their behavior.
  3. Consult with trusted individuals: Sometimes an outside perspective can provide invaluable insights that you might have missed.

Understanding the mechanics of love bombing is the first step in protecting yourself from falling into such emotional manipulation traps. By being aware of the signs and knowing how to counteract them, you can maintain control over your emotional well-being.

Read more

Donald Trump pathocracy, by Midjourney

Pathocracy is a relatively lesser-known concept in political science and psychology, which refers to a system of government in which individuals with personality disorders, particularly those who exhibit psychopathic, narcissistic, and similar traits (i.e. the “evil of Cluster B”), hold significant power. This term was first introduced by Polish psychiatrist Andrzej Łobaczewski in his work “Political Ponerology: A Science on the Nature of Evil Adjusted for Political Purposes.”

The crux of pathocracy lies in the rule by a small pathological minority, which imposes a regime that is damaging to the majority of non-pathological people. The key characteristics of pathocratic leadership include a lack of empathy, a disregard for the rule of law, manipulation, authoritarianism, and often, brutal repression.

Origins and development of the concept of pathocracy

The concept of pathocracy emerged from Łobaczewski’s study of totalitarian regimes, particularly Nazi Germany under Adolf Hitler and the Communist Soviet Union under Joseph Stalin. Born in Poland in 1921, he witnessed the upheaval and transformation of his own country during the horrors of World War II and subsequent Communist occupation.

He suffered greatly to arrive at the insights in his work — arrested and tortured by the Polish authorities under Communist rule, he was unable to publish his magnum opus, the book Political Ponerology, until he escaped to the United States during the 1980s. Łobaczewski spent the rest of his life and career trying to unpack what had happened to him, his community, and his nation — such brutality over such a shockingly short span of time.

Łobaczewski posits that these authoritarian and fascist regimes were not merely politically oppressive, but also psychologically abnormal. He studied the characteristics of these leaders and their closest supporters, identifying patterns that aligned with known personality disorders. His work also identified a much higher percentage of personality-disordered individuals than is commonly understood even today, finding that about 7% of the general population could be categorized as severely lacking in empathy and possessing the tendencies — latent or overt — that lead to the rise of pathocracy in a society.

Characteristics of pathocratic leadership

  • Psychopathy: Leaders in a pathocracy often display traits synonymous with psychopathy, including a lack of empathy, remorse, and shallow emotions.
  • Narcissism: Excessive self-love and a strong sense of entitlement often drive pathocratic rulers.
  • Manipulation: These leaders are adept at manipulation, using deceit and coercion to maintain their power. They also often exhibit other traits and behaviors of emotional predators.
  • Paranoia: A heightened sense of persecution or conspiracy is common, leading to oppressive and authoritarian measures.
  • Corruption: Moral depravity, ethical degeneration, and widespread corruption are endemic in a pathocracy, as pathological leaders tend to surround themselves with similarly affected individuals who feel no shame about performing unethical and/or illegal actions either in secret, or in broad daylight with little fear of retaliation.
Read more