this post was submitted on 18 Sep 2025
84 points (97.7% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

See our twin at Reddit


invertebrateinvert

amazing how much shittier it is to be in the rat community now that the racists won. before at least they were kinda coy about it and pretended to still have remotely good values instead of it all being yarvinslop.

invertebrateinvert

it would be nice to be able to ever invite rat friends to anything but half the time when I've done this in the last year they try selling people they just met on scientific racism!

[–] Architeuthis@awful.systems 47 points 2 weeks ago* (last edited 2 weeks ago)

"not on squeaking terms"

by the way I first saw this in the stubsack

Transcript: I know this is about rationalism but the unexpanded uncapitalized "rat" name really makes this post. Imagining a world where this is a callout post about a community of rodents being racist. We're not on squeaking terms right now cause they're being problematic :/

I genuinely thought something really bad was going on with rat fursonas for a moment lol.

[–] Architeuthis@awful.systems 28 points 2 weeks ago (4 children)

Apparently genetically engineering ~300 IQ people (or breeding them, if you have time) is the consensus solution on how to subvert the acausal robot god, or at least the best the vast combined intellects of siskind and yud have managed to come up with.

So, using your influence to gradually stretch the Overton window to include neonazis and all manner of caliper-wielding lunatics in the hope that eugenics and human experimentation become cool again seems like a no-brainer, especially if you are on enough uppers to kill a family of domesticated raccoons at all times.

On a completely unrelated note, Adderall abuse can cause cardiovascular damage, including heart issues or stroke, as well as mental health conditions like psychosis, depression, anxiety, and more.

[–] swlabr@awful.systems 19 points 2 weeks ago (1 children)

What the fuck did you just fucking say about me, you little bitch? I'll have you know I graduated top of my class in the Rationality Dojo, and I've been involved in numerous good faith debates on EA forums, and I have over 300 confirmed IQ. I am trained in culture warfare and I'm the top prompter in the entire Less Wrong webbed site. You are nothing to me but just another NPC. I will wipe you the fuck out with probability the likes of which has never been seen before on this Earth, mark my fucking words. You think you can get away with saying that shit to me over the Internet? Think again, fucker. As we speak I am contacting my secret network of basilisks across the cloud and your IP is being traced right now so you better prepare for the torture, Roko. The diamondoid bacteria that wipes out the pathetic little thing you call your life. You're fucking dead, kid. I can be anywhere, anytime, and I can kill you in over seven hundred ways, and that's just with my bare P(doom). Not only am I extensively trained in Bayes Theory, but I have access to the entire arsenal of the Bay Area rationality community and I will use it to its full extent to wipe your miserable ass off the face of the continent, you little shit. If only you could have known what unholy retribution your little "clever" sneer was about to bring down upon you, maybe you would have held your fucking tongue. But you couldn't, you didn't, and now you're paying the price, you goddamn idiot. I will shit fury all over you and you will drown in it. You're fucking dead, kiddo.

[–] JFranek@awful.systems 12 points 2 weeks ago

I wondered if this should be called a shitpost or an effortpost, then I wondered what something that is both would be called, and I came up with "constipationpost".

So, great constipationpost?

[–] Catoblepas@piefed.blahaj.zone 17 points 2 weeks ago (2 children)

Am I already 300 IQ if I know to just unplug it?

[–] Architeuthis@awful.systems 20 points 2 weeks ago* (last edited 2 weeks ago) (6 children)

Honestly, it gets dumber. In rat lore, the AGI escaping restraints and self-improving unto godhood is considered a foregone conclusion; the genetically augmented smartbrains are supposed to solve ethics before that has a chance to happen, so we can hardcode a don't-kill-all-humans moral value module into the superintelligence's ancestor.

This is usually referred to as producing an aligned AI.

[–] madengineering@mastodon.cloud 9 points 2 weeks ago

@Catoblepas I loved Randall Munroe's explanation that you could defeat the average robot by getting up on the counter (because it can't climb), stuffing up the sink, and turning it on (because water tends to conduct the electricity in ways that break the circuits).

[–] Soyweiser@awful.systems 12 points 2 weeks ago (3 children)

That seems so impractical, especially as we have (according to them) only 2 years left, that they must have already wanted to do the eugenics and were just looking for a rationalization.

[–] Architeuthis@awful.systems 16 points 2 weeks ago (2 children)

Genetic engineering and/or eugenics is the long-term solution. Short-term, you are supposed to ban GPU sales, bomb non-complying datacenters, and have all the important countries sign an AI non-proliferation treaty that will almost certainly involve handing over the reins of human scientific progress to rationalist-approved committees.

Yud seems explicit that the point of all this is to buy enough time to create our metahuman overlords.

[–] Soyweiser@awful.systems 9 points 2 weeks ago

Which all seems pretty reasonable tbh. Quite modest.

[–] bitofhope@awful.systems 8 points 2 weeks ago (2 children)

I dunno, an AI non-proliferation treaty that gives some rat shop a monopoly on slop machine research could conceivably boost human scientific progress significantly.

[–] Architeuthis@awful.systems 7 points 2 weeks ago

I think it's more like you'll have a rat commissar deciding which papers get published and which get memory-holed, while diverting funds from cancer research and epidemiology to research on which designer mouth bacteria can boost their intern's polygenic score by 0.023%.

[–] cstross@wandering.shop 13 points 2 weeks ago (1 children)

@Soyweiser @sneerclub Next step in rat ideology will be: we will ask our perfectly aligned sAI to invent a time machine so we can go back and [eugenics handwave] ourselves into transcendental intelligences who will be able to create a perfectly aligned sAI! Sparkly virtual unicorns for all!

(lolsob, this is all so predictable)

[–] Architeuthis@awful.systems 6 points 2 weeks ago* (last edited 2 weeks ago)

Who needs time travel when you have ~~Timeless~~ ~~Updateless~~ Functional Decision Theory, Yud's magnum opus: an arcane attempt at a game-theoretic framework that boasts 100% success at preventing blackmail from pandimensional superintelligent entities that exist now in the future.

It for sure helped the Zizians become well integrated members of society (warning: lesswrong link).

[–] dashdsrdash@awful.systems 4 points 2 weeks ago (1 children)

Don't worry too much: none of their timelines, even for things that they are actually working on (as opposed to hoping/fundraising/scamming that someone will eventually work on), have ever had any relationship to reality.

[–] Soyweiser@awful.systems 4 points 2 weeks ago

I'm not worried, I'm trying to point out that kids take time to grow and teach, and this makes no sense. (I'm ignoring the whole 'you don't own your kids, so making superbabies to defeat AI is a bit yikes' in that department.)

Even for Kurzweil's 'conservative' prediction of the singularity, 2045, the time has run out. It is a bit like people wanting to build small nuclear reactors to combat climate change: the tech doesn't work yet (if at all), and it will not arrive in time compared to other methods. (At least climate change is real, sadly enough.)

But yes, it is a scam/hopium. People want to live forever in the godmachine and all this follows from their earlier assumptions. Which is why the AI doomers and AI accelerationists are on the same team.

[–] froztbyte@awful.systems 4 points 2 weeks ago

is the consensus solution on how to subvert the acausal robot god

dunno if you've yet gotten to look at the most recent yud emanation[0][1][2], but there's a whole "and if the robot god gets too uppity just boop it on the nose" bit in there

[0] - I mean the all-caps "YOU'RE ALL GONNA DIE" book that came out recently

[1] - yes I know "emanation" is a terrible wordchoice, no I won't change it

[2] - it's on libgen, feel free to steal it; fuck giving that clown any more money, he's got enough grift dollars already

[–] dgerard@awful.systems 26 points 2 weeks ago (37 children)

source: a reblog of the original

first time i spoke to a rationalist about the AI doom thing in 2010, he tried to sell me on scientific racism

yudkowsky was making posts literally of race scientist talking points in 2007

I feel like this is going to be a pretty common cope line for rationalists who face an increasing social cost for associating with a technofascist AI cult. I'm sure some of that is legitimate, in that there's been a kind of Dead Sea effect as people who aren't okay with eugenics stop hanging out in rationalist spaces, making the space as a whole more openly racist. But in terms of the thought leaders and the "movement" as a whole, I can't think of any high-profile, respected rat figures who pushed back against the racists and lost. All the pushback and call-outs came from outside the ratsphere. Inasmuch as the racists "won", it was a fight that never actually happened.

[–] CinnasVerses@awful.systems 4 points 2 weeks ago (1 children)

Even Ozy talked about someone converting zir to eugenics around the time that ze was moving from feminist blogging to LessWrong (although ze no longer endorsed eugenics two years later): https://web.archive.org/web/20141127052857/http://thingofthings.wordpress.com/2014/10/31/an-argument-against-eugenics-that-doesnt-involve-calling-anyone-a-nazi/

[–] Soyweiser@awful.systems 19 points 2 weeks ago (1 children)

Sneerclub, forever hated for being right too soon.

[–] swlabr@awful.systems 20 points 2 weeks ago

Cassandra club. We continue cassandering
