Notes: Surveillance capitalism dystopia, with Zeynep Tufekci

speak, sistah!

see also: Shoshana Zuboff (who wrote the seminal work on surveillance capitalism), Don Norman

Some takeaways:

  • surveillance won’t be obvious and overt like in Orwell’s 1984 — it’ll be covert and subtle (“more like a spider’s web”)
  • social networks use persuasion architecture — the same manipulative design thinking that puts candy and gum at children’s eye level in the grocery checkout aisle

Example:

AI modeling of potential Las Vegas ticket buyers

The machine learning algorithms can classify people into two buckets, “likely to buy tickets to Vegas” and “unlikely to,” based on exposure to lots and lots of data patterns. The problem: it’s a black box, and no one — not even the computer scientists who built it — knows exactly how it works or what it’s doing.
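The “two buckets” idea can be sketched in a few lines of plain Python: a toy logistic-style classifier whose learned weights end up as opaque numbers. The behavioral feature names and data below are entirely made up for illustration; real ad-targeting systems are vastly larger, but the black-box flavor is the same.

```python
import math

def train(examples, epochs=2000, lr=0.1):
    """Tiny logistic-regression-style trainer via gradient descent."""
    n = len(examples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in examples:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1 / (1 + math.exp(-z))      # predicted probability
            g = p - y                        # gradient of the loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(model, x):
    w, b = model
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "likely" if 1 / (1 + math.exp(-z)) > 0.5 else "unlikely"

# Hypothetical behavioral signals per user:
# [late-night activity, spending spikes, travel searches]
training_data = [
    ([0.9, 0.8, 0.7], 1), ([0.8, 0.9, 0.6], 1),
    ([0.1, 0.2, 0.1], 0), ([0.2, 0.1, 0.3], 0),
]
model = train(training_data)
print(predict(model, [0.85, 0.9, 0.8]))
print(predict(model, [0.10, 0.1, 0.2]))
# The learned weights are just floats; inspecting them tells you almost
# nothing about *why* a user was bucketed — which is the black-box problem.
```

Even in this four-example toy, the “explanation” for any single prediction is a pile of numbers, not a reason.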

So the AI may have discovered that bipolar individuals just about to enter mania are more susceptible to buying tickets to Vegas — and that is the population segment it ends up targeting: a vulnerable set of people prone to overspending and gambling addiction. The ethical implications of unleashing this on the world, and relentlessly optimizing it, are staggering.

Profiting from extremism

“You’re never hardcore enough for YouTube” — YouTube gives you content recommendations that are increasingly polarized and polarizing, because it turns out that preying on your reptilian brain makes you keep clicking around in the YouTube hamster wheel.
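The “hamster wheel” dynamic can be sketched as a toy feedback loop: a rule that always surfaces something slightly more intense than what you just watched, because engagement data rewards it. The intensity scores and the recommendation rule are invented for illustration and reflect nothing about YouTube’s actual system.

```python
def recommend(current_intensity, catalog):
    # Surface the watchable item just above the user's current intensity.
    candidates = [i for i in catalog if i > current_intensity]
    return min(candidates) if candidates else max(catalog)

catalog = [round(0.1 * k, 1) for k in range(1, 11)]  # intensities 0.1 .. 1.0
intensity = 0.1                                       # user starts mild
history = [intensity]
for _ in range(8):
    intensity = recommend(intensity, catalog)
    history.append(intensity)
print(history)  # each step is at least as intense as the last
```

No single recommendation looks extreme; the ratchet only shows up in the trajectory.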

The amorality of AI — “algorithms don’t care if they’re selling shoes, or politics.” Our social, political, and cultural flows are being organized by these persuasion architectures — organized for profit; not for the collective good, not for the public interest, no longer subject to our political will. These powerful surveillance capitalism tools run mostly unchecked, with little oversight, and with essentially no one minding the ethical store beyond a cadre of Silicon Valley billionaires.

Intent doesn’t matter — good intentions aren’t enough; it’s the structure and business models that matter. Facebook isn’t a half-trillion-dollar con: its value lies in its highly effective persuasion power, which is deeply troubling in a supposedly democratic society. Mark Zuckerberg may even ultimately mean well (…debatable), but that doesn’t excuse steamrolling over the numerous, obviously negative externalities of Facebook’s unchecked power, not only in the U.S. but in countries around the world, including highly volatile regions.

Extremism benefits demagogues — Oppressive regimes both come to power through and benefit from political extremism: whipping citizens into a frenzy, often against each other as much as against perceived external or internal enemies. Our data and attention are now for sale to the highest-bidding authoritarians and demagogues around the world, enabling them to use AI against us in election after election and PR campaign after PR campaign. We have given foreign dictators even greater power to influence and persuade us in ways that benefit them at the expense of our own self-interest.

Design guru Don Norman’s shortlist of everything wrong with the internet

When usability pioneers have All the Feels about the nature of our creeping technological dystopia, how we got here, and what we might need to do to right the ship, it’s wise to pay attention. Don Norman’s preaching resonated with my choir, and they’ve asked me to sing a summary song of our people in bulleted list format:

  • What seemed like a virtuous thing at the time — building the internet with an ethos of trust and openness — has led to a travesty via lack of security, because no one took bad actors into account.
  • Google, Facebook, et al. didn’t have the advertising business model in mind a priori; they stumbled into it and got carried away giving advertisers what they wanted — more information about users — without really considering the boundary violations of appropriating people’s information. (see Shoshana Zuboff’s definitive new book on Surveillance Capitalism for a lot more on this topic)
  • Tech companies have mined the psychological sciences for techniques that — especially at scale — border on mass manipulation of fundamental human drives to be informed and to belong. Beyond the creepy Orwellian slant of information appropriation and emotional manipulation, the loss of productivity and mental focus from years of constant interruptions takes a toll on society at large.
  • We sign an interminable series of EULAs, ToS’s and other lengthy legalese-ridden agreements just to access the now basic utilities that enable our lives. Experts refer to these as “contracts of adhesion” or “click-wrap,” as a way of connoting the “obvious lack of meaningful consent.” (Zuboff)
  • The “bubble effect” — the internet allows one to surround oneself completely with like-minded opinions and avoid ever being exposed to alternative points of view. This has existential implications for being able to inhabit a shared reality, as well as a deleterious effect on public discourse, civility, and the democratic process itself.
  • The extreme commercialization of almost all of our information sources is problematic, especially in the age of the “Milton Friedman-ification” of the economic world: values skewed away from communities and individuals toward a myopic view of shareholder value, with all the attendant perverse incentives that have accompanied this philosophical business shift over the past 50 years. He notes that the original public-spiritedness of new communication technologies has historically been co-opted by corporate lobbyists via regulatory capture — a subject Tim Wu explores in depth in his excellent 2010 book, “The Master Switch: The Rise and Fall of Information Empires.”

Is it all bleak, Don?! His answer is clear: “yes, maybe, no.” He demurs on positing a definitive answer to all of these issues, but he doesn’t really mince words about a “hunch” that it may in fact involve burning it all down and starting over again.

Pointing to evolution, Norman notes that we cannot eke radical innovation out of incremental changes — and that when radical change does happen it is often imposed unexpectedly from the outside in the form of catastrophic events. Perhaps if we can’t manage to Marie Kondo our way to a more joyful internet, we’ll have to pray for Armageddon soon…?! 😱