hey so I got this letter from OSHA saying you’re no longer qualified to post here? please step away from the forklift
no it isn’t, your posts are still shit
this is the happy honeymoon phase?
this marriage is fucked
Yudkowsky got wind of his skepticism and reached out to Ross to discuss the topic with him. He also requested that Ross not do any research on him.
I pinky promise I’m an expert! no you’re not allowed to check my credentials, the fuck?
you know, even knowing who and what Altman really is, that “politically homeless” tweet really is shockingly fascist. it’s got all my favorites!
- nationalism in every paragraph
- large capitalism will make me rich, and so can you!
- small government (but only the parts that Sam doesn’t like)
- we can return to a fictional, bright past
so countdown until Altman goes full-throated MAGA, and in spite of how choreographed and obvious it is, it somehow still comes as a surprise to the people in our industry desperately clinging to the idea that software can't be political
Alternately I guess I could like, ask for an instance ban or something, if that doesn't make the instance un-viewable from my account
hey no problem, we’ve got systems in place for this kind of thing. happy trails.
(though for the record, re the idea that right-wing posters are allowed in here without being told to go fuck themselves: lol)
no problem! I don’t mean to give you homework, just threads to read that might be of interest.
yeah, a few of us are Philosophy Tube fans, and I remember they’ve done a couple of good videos about parts of TESCREAL — their Effective Altruism and AI videos specifically come to mind.
if you’re familiar with Behind the Bastards, they’ve done a few videos I can recommend dissecting TESCREAL topics too:
- their episodes about the Zizians are definitely worth a listen; they explore and critique the group as a cult offshoot of LessWrong Rationalism.
- they did a couple of older videos on AI cults and their origins that are very good too.
also fair enough. you might still enjoy a scroll through our back archive of threads if you’ve got time for it — there is a historical context to transhumanism that people like Musk exploit to further their own goals, and that’s definitely something to be aware of, especially as TESCREAL elements gain overt political power. there are positive versions of transhumanism and the article calls one of them out — the Culture is effectively a model for socialist transhumanism — but one must be familiar with the historical baggage of the philosophy or risk giving cover to people currently looking to cause harm under transhumanism’s name.
let them eat prompts
fair enough!
but I don’t actually enjoy arguing and don’t have the skills for formalized “debate” anyway.
it’s ok, nobody does. that’s why we ban it unless it’s amusing (which effectively bans debate for everyone unless they know their audience well enough to not fuck up) — shitty debatelords take up a lot of thread space and mental energy and give essentially nothing back.
wherever “here” is
SneerClub is a fairly old community if you count in its Reddit origins; part of what we do here is sneering at technofascists and other adherents to the TESCREAL belief package, though SneerClub itself tends to focus on the LessWrong Rationalists. that’s the context we tend to apply to articles like the OP.
There is a certain irony to everyone involved in this argument, if it can be called that.
don’t do this debatefan here crap here, thanks
This, and similar writing I've seen, seems to make a fundamental mistake in treating time as though only the next few decades (maybe) exist: that any objective that takes longer than that is impossible and not even worth trying, and that any problem that emerges after a longer period of time may be ignored.
this isn’t the article you’re thinking of. this article is about Silicon Valley technofascists making promises rooted in Golden Age science fiction as a manipulation tactic. at no point does the article state that, uh, long-term objectives aren’t worth trying because they’d take a long time??? and you had to ignore a lot of the text of the article, including a brief exploration of the techno-optimists and their fascist ties (and contrasting cases where futurism specifically isn’t fascist-adjacent), to come to the wrong conclusion about what the article’s about.
unless you think the debunked physics and unrealistic crap in Golden Age science fiction will come true if only we wish long and hard enough in which case, aw, precious, this article is about you!
no fucking thanks