How I escaped longtermism

Published on 2024-01-04


Recently I was listening to an episode of Tech Won't Save Us:

How Effective Accelerationism Divides Silicon Valley

In it, Paris Marx talks with Émile P. Torres about a number of Silicon Valley-type ideologies, three of which matter for the purposes of this gemlog entry: longtermism, transhumanism and singularitarianism.

One thing you may not know about me is that back in high school, I was a card-carrying transhumanist and singularitarian. Like, if you'd asked (you almost certainly wouldn't have, I hope), I would have described myself as a transhumanist and singularitarian.

As for longtermism, I don't think I ever made it there. I was pretty close, though: I was aware of it, and I almost made the jump. I don't know what specific event, if any, stopped me, but I never did. Thank fucking God.

If you aren't familiar with those names and you want to know what I personally think they mean:

  • Transhumanism is the belief that humans should "transcend" the human body. This usually comes in the form of cyborg shit and mind uploading, but when transhumanists aren't low-key to high-key white supremacists, they might also care about things like HRT and other medical interventions that can change the body for the better.
  • Singularitarianism is the belief in a mystical future event called the Singularity, which is kind of like the rapture for tech bros. The Singularity is loosely defined, but we can think of it as the moment computers "surpass" the intelligence of humans, at which point, believers hold, computers will rapidly and explosively improve their own intelligence, possibly reaching godhood in a rather short period of time. The general idea is that life after the Singularity will be so different that, today, we can't even begin to imagine what it'll look like.
  • Longtermism is a radical extension of effective altruism, which is also kind of vague, but we can think of it as using scientific or mathematical decision making to answer moral and ethical questions. The central question longtermism seeks to answer is, as Marx and Torres put it: if you had the choice between subjecting one person to horrors of torture beyond our imagination for millions of years, or subjecting an unimaginably large number of people (think, ten to the quintillionth power) to the discomfort of a single speck of dust landing on their shoulder, which should you choose? The longtermist would choose to torture the person for millions of years. (The sketch after this list shows the arithmetic that gets you there.)
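
To make that choice less abstract, here's a minimal sketch of the naive utilitarian arithmetic it relies on. This is my own illustration, not anything from the podcast, and every number in it is made up; the only real point is that a tiny harm multiplied across an astronomically large population eventually outweighs any fixed harm, no matter how huge.

```python
import math

# Naive utilitarian aggregation behind the torture-vs-dust-specks choice.
# All numbers here are made up for illustration. The population is
# "ten to the quintillionth power," far too large to represent directly,
# so we compare total harms in log10 space instead.

log10_people = 1e18        # log10 of the population: 10**(10**18) people
log10_speck_harm = -6.0    # a dust speck costs one millionth of a "harm unit"
log10_speck_total = log10_people + log10_speck_harm

torture_harm_per_year = 1e6   # harm units per year, a generous estimate
years = 1e6                   # "millions of years"
log10_torture_total = math.log10(torture_harm_per_year * years)

# The dust specks win by roughly 10**18 orders of magnitude, so the
# "rational" answer is to torture the one person.
print(log10_speck_total > log10_torture_total)  # True
```

Of course, the whole trick is in pretending "harm units" exist and add up linearly, which is exactly the move I criticize below.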

And, for good measure, my personal critique of these three ideologies:

  • Transhumanism is rarely, if ever, for the masses. Practically speaking, if the people with the means to organize society towards achieving the aims of transhumanism were to do so, the vast majority of society wouldn't benefit from it. I sure as hell wouldn't. I also think there's an open question of whether or not we should be "transcending" our human bodies, but I definitely still understand the appeal after all these years. Especially as a transgender person. Also, it's probably worth noting that transhumanism has a long tradition of being pro-eugenics, unsurprisingly.
  • Singularitarianism is predicated on the Singularity happening. It's like a religious belief in many ways, as Marx and Torres explore in the podcast. There's nothing wrong with that intrinsically; I just don't personally believe it's going to happen anymore. Exponential growth is pretty rare in nature. And if it did happen, I'm extremely unconvinced that it'd be a good thing.
  • Longtermism... well... I don't really have anything to say about it, besides that it should definitely feel absurd. It is absurd. Yeah, if you were to quantify happiness and pleasure absolutely and apply Bayesian reasoning, then maybe it'd be an awesome idea to torture that person for millions of years, but I think that's probably a better argument for why we absolutely should not ignorantly apply "math" to fundamentally subjective problems. Math operates in the realm of math, not human ethics.

In hindsight, I almost certainly got into transhumanism because I'm transgender. I wanted transhumanism because I wanted to try out different genders. That was my "in," though I definitely also felt a lot of what Marx and Torres discussed: that people who get into this stuff do so out of a need for some lost religiosity in their lives. People in the West are growing more and more secular, more atheist, and are only now starting to realize why exactly people invented this whole religion thing in the first place.

It's nice! It's nice to think that things matter! That if things suck now, then one day we'll be saved!

In many ways, these tech bro religions are kind of like our contemporary incarnation of science-as-religion. Or more specifically, technology-as-religion.

I got into singularitarianism (god I have to be careful when spelling out that word) somewhat by accident. I was looking for books on transhumanism, and people online pointed me towards "The Singularity Is Near" by Raymond Kurzweil, a man who is ABSOLUTELY transgender (Ray, if you're reading this, please just listen to yourself talk). In it, he defined "singularitarian" as someone who's given the Singularity a good thought and strives to be prepared for it. That sounded a lot like me at the time.

As for longtermism, again, that was kind of a near miss. Lots of this thinking is organized around a website called LessWrong. I followed it for a bit and thought about trying to join in on the discussion, but that never ended up happening. I think what might have pushed me away was that I was aware they kind of had a Nazi problem. I was interested in their discussions around the philosophy of AI; that was something I was really into at the time. But I always felt like there was a background radiation of bigotry, and that I wouldn't be welcome.

So, I could have gone much further down the pipe, but I was held back.

I think what ultimately turned me away from it was the realization that this stuff just wasn't for me. Not in that it didn't interest me, but in that people weren't "doing" this stuff for people like me. Up until that point in life, I'd been treated like I was better than everyone else. I had the high schooler god complex that grips too many former gifted kids. For one, I'm white. That came with a lot of privilege. My family would be able to send me to university, and encouraged me to do so. And, having lucked out and gotten STEM autism, I did pretty alright in school. I definitely didn't have an easy life, but I wasn't being subjected to the kind of abuse that gives you a greater sense of humility. One thing led to another, and I started having these very lofty visions for the future and for myself. Once I came out, once I was old enough to come to grips with the material conditions of my life, and once I started to realize how Kafkaesque the world I was living in truly was, it was much more obvious that I wasn't going to be on the list of people to have their minds uploaded to Heaven on Earth.

For a while, I had nothing.

I really didn't like that.

More recently though, I got to the other end of the tunnel of atheistic nihilism; that is, I started to embrace the value of religion again.

Longtermism works well as a religion for a lot of people. The problem is, it's a legitimately dangerous philosophy for people to have, especially people in power.

I don't think it really matters what religion it is you choose to embrace, but believing in something rather than nothing, no matter how silly or irrational it is, can pay off quite a bit. It only becomes problematic when you allow that religion to lead you to do harm to others. You don't need to do that. Even if you're following a religion that is infamous for doing harm to others, like Christianity, there are ways you can re-imagine its stories to be more pro-social without sacrificing the value its traditions bring to your life.

It's nice to believe in things.

Respond to this article

If you have thoughts you'd like to share, send me an email!

See here for ways to reach out