This was posted on Catholic Easter Sunday on the SSC subreddit. It's a posted-on-April-1st-for-plausible-deniability Siskind post from back in 2018, where he outlines a kind of argument that an all-powerful entity which is God in all but name (and which obviously emanated from a culture discovering AGI) is actually "logically necessary".
He calls the whole thing "The Hour I First Believed". I think it's notable for being a bit of a treasure trove of rationalist weird accepted truths, such as:
- All copies of a consciousness share a self, because consciousness is like an equation, or something:
> But if consciousness is a mathematical object, it might be that two copies of the same consciousness are impossible. If you create a second copy, you just have the consciousness having the same single stream of conscious experience on two different physical substrates.
Which is both the original transhumanist cope enabling so-called consciousness uploading (so that it's not just copying a simulacrum of your personality to a computer while you continue to rot away), and also what makes the basilisk torturing you possible.
- And its corollary, Simulation Capture:
> This means that an AI can actually “capture” you, piece by piece, into its simulation. First your consciousness is just in the real world. Then your consciousness is distributed across one real-world copy and a million simulated copies. Then the AI makes the simulated copies slightly different, and 99.9999% of you is in the simulation.
Which is a kind of nuts I hadn't happened upon before.
There's also a bunch of rationalist decision theory stuff, which I think makes it obvious that it was concocted to serve this type of narrative in the first place rather than to be broadly useful, Yud's posturing as a decision theory trailblazer notwithstanding.
I am not reading a SlateStar essay early on a Monday, but I think this is a response to Yud's teaching that a copy of you is really you, so Colossus can really bring you back to life in digital heaven / hell. '90s Star Trek had some episodes about 'what if the transporter makes two copies of you?' Scott Alexander / SlateScott avoids talking about Yudkowsky's ideas in detail. I used to think he saw Yudkowsky as someone who got the rubes in the door to hear the good word about race and IQ, but then they worked on AI 2027 together. https://pivot-to-ai.com/2025/08/17/ai-doomsday-and-ai-heaven-live-forever-in-ai-god/
It's not so much a response as it is just running with it until you hit the concepts of the soul and the godhead face first.
edit: it's also mercifully short, like not even 3k words.
Although not by accident: Scott Alexander is a practicing Jew and Unsong is the kind of thing that someone interested in theology and Neo-Platonism writes. So I think he knows his friends are recapitulating Christianity, but if he backed away from them over that, they might back away from him reinventing social Darwinism and eugenics.