Longtermism

Longtermism is an extreme ideology that has gained traction in Silicon Valley and the technosphere; both Elon Musk and Peter Thiel are acolytes. In this worldview, humanity's far future will have colonized the stars and will number in the trillions, which makes all the puny little humans alive today essentially worthless and expendable in their eyes (except themselves, of course). As long as climate change doesn't kill *absolutely all* 7 billion of us, we'll manage to soldier on, and in longtermist arithmetic a catastrophe humanity survives barely registers next to an "existential risk," like rogue AI, that could wipe out those future trillions. Therefore, they say, we should focus on AI instead.

Their breezy dismissal of moral weight for anything whose effects play out within the next hundred years is also chilling. Through the frame-shifting device of the far, far future, longtermism can render basically anything a rounding error of no importance, from the Holocaust to the dropping of the atomic bombs to the famines of Stalin and Mao. That is just not going to sit well with people who have empathy, which is to say most people.

Related terms:

  • existential risk
  • effective altruism
  • non-linear climate change
  • “vast and glorious potential”