this post was submitted on 15 Jun 2025
14 points (100.0% liked)

TechTakes

1999 readers
158 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

(page 2) 50 comments
[–] swlabr@awful.systems 9 points 1 week ago

Doing some reading about the SAG-AFTRA video game voice acting strike. Anyone have details about "Ethovox", the AI company that SAG has apparently partnered with?

[–] fullsquare@awful.systems 9 points 1 week ago (1 children)
[–] o7___o7@awful.systems 4 points 1 week ago

This guy is a real self-licking ice cream cone (flavor: pralines and dick)

[–] sailor_sega_saturn@awful.systems 9 points 1 week ago* (last edited 1 week ago) (1 children)

More network state nonsense is afoot: https://frontiervalley.com/ https://www.thenerdreich.com/startup-seeks-trump-ai-emergency-for-california-tech-city/

They (who?) have publicly drafted an executive order because they want to take over the Alameda Naval Air Station (a superfund site).

Edit: Per the twitter account the weirdo behind this is James Ingallinera.

[–] swlabr@awful.systems 8 points 1 week ago (8 children)

network state

Great, a new stupid thing to know about. How likely is it that a bunch of people that believe they are citizens of an online state will become yet another player in the Stochastic Terrorism as a Service industry?


Easy Money author (and former TV star) Ben McKenzie's new cryptoskeptic documentary is struggling to find a distributor. Admittedly, the linked article is more a review of the film than a look at the distributor angle. Still, it looks like it's telling the true story in a way that will hopefully connect with people, and it would be a real shame if it didn't find an audience.

[–] BlueMonday1984@awful.systems 8 points 1 week ago (1 children)

Finally circling back around to this.

Feels like I am not just doing my job but also the work the operator of the service or product I am having to use through chat should have paid professionals to do. And I’m not getting paid for it.

Speaking as someone who has worked extensively in IT support, I think that's the sales pitch for these chatbots. They don't want to give users tools and knowledge to solve their own problems - or rather they do, but the chatbots aren't part of that. The chatbots are supposed to replace the people who would interact with the relevant systems on your behalf.

And honestly, working with a support person is already a deeply unsatisfying interaction in the vast majority of cases. Even in the best-case scenario it involves acknowledging that some part of your job has exceeded your ability and you need specialized help, and handling that well is a very rare personality trait. But the massive variety of interconnected systems we rely on is too complex for this not to be a common occurrence. Even if you radically opened everything from internal bug trackers to licensing systems to communications, there wouldn't be enough time in the day for everyone to learn those systems well enough to perfectly self-solve all their problems, and that lack of systems knowledge would be a massive drain on your operations.

But trying to fit in an LLM chatbot is the worst of both worlds: your users are locked away from the tools and knowledge that would let them solve their own issues, yet still need to learn how to wrangle your intermediary system, and that system doesn't have the human ability to connect, build a working relationship, and get through those issues in a positive way.

[–] fasterandworse@awful.systems 7 points 1 week ago

new rant from me about how boosters asking critics to admit that AI is "useful" is not the win they think it is.
vid: https://www.youtube.com/watch?v=bRcBCji6XvE
audio: https://pnc.st/s/faster-and-worse/94cb1cda/useful-is-nothing

[–] Soyweiser@awful.systems 7 points 1 week ago (6 children)

Weird conspiracy theory musing: So we know Roko's Basilisk only works on a very specific type of person, who needs to believe in all the LW stuff about what the AGI future will be like, but who also feels morally responsible and has high empathy. (Else the thing falls apart: you need to care about, feel responsible for, and believe the copies/simulated beings are conscious.) We know caring about others/empathy is one of those traits which seems rarer on the right than the left, and there is a feeling that a lot of the right is waging a war on empathy (see the things Musk has said, the whole chan culture shit, but also themotte, which somebody once called an 'empathy removal training center' - that stuck, so I also call it that. If you are inside one of these pipelines you can notice it, or if you get out you can see it looking back; I certainly did when I read more LW/SSC stuff). We also know Roko is a bit of a chud who wants some sort of 'transhumanist' 'utopia' where nobody is non-white or has blue hair (I assume this is known, but if you care to know more about Roko (why?), search sneerclub (OK, one source as a treat)).

So here is my conspiracy theory: Roko knew what he was doing; it was intentional on Roko's part. He wanted to drive the empathic part of LW mad and discredit them. (That he was apparently banned from several events for sexual harassment is also interesting. It reminds me of another 'lower empathy' thing, the whole manosphere/PUA scene, which was a part of early LW and which often trains people to think less of women.)

Note that I don't actually believe this, as there is no proof for it; I don't think Roko planned it (or considered it in any way), and I think his post was just an honest thought experiment (as was Yud's reaction). It was just an annoying thought which I had to type up, or else I'd keep thinking about it. Sorry to make it everybody's problem.

[–] aio@awful.systems 9 points 1 week ago (7 children)

I thought part of the schtick is that according to the rationalist theory of mind, a simulated version of you suffering is exactly the same as the real you suffering. This relies on their various other philosophical claims about the nature of consciousness, but if you believe this then empathy doesn't have to be a concern.

[–] saucerwizard@awful.systems 6 points 1 week ago

iirc he has a lawyer on retainer in case of another sexual harassment claim.

[–] BlueMonday1984@awful.systems 5 points 1 week ago

...Honestly, I can't help but feel you're on to something. I'd have loved to believe this was an honest thought experiment, but after seeing the right openly wage a war on empathy as a concept, I wouldn't be shocked if Roko's Basilisk (and its subsequent effects) was planned from the start.

[–] BlueMonday1984@awful.systems 4 points 1 week ago* (last edited 1 week ago)

ZITRON DROPPED (sadly, it's premium)
