this post was submitted on 15 Oct 2025
286 points (93.6% liked)

TechTakes

2255 readers
91 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago
[–] hendrik@palaver.p3x.de 40 points 4 days ago* (last edited 4 days ago) (4 children)

Interesting. Why would more manipulative people and ones with more focus on self-interest use AI more than other people? Because they're more likely to take shortcuts while doing stuff? Or is there any other direct benefit for them?

[–] owenfromcanada@lemmy.ca 53 points 4 days ago (1 children)

I imagine there are a few reasons. An LLM is a narcissist's dream--it will remain focused on you and tell you what you want to hear (and is always willing to be corrected).

In addition, LLMs are easy to manipulate, and they mimic a person just enough to give you a sense of power or authority. So if you're the type of person who gets something out of that, there's likely a draw for you.

Those are just guesses, though. I don't use LLMs myself, so I don't really know.

[–] hendrik@palaver.p3x.de 10 points 4 days ago* (last edited 4 days ago)

Thanks, that sounds reasonable. Especially the focus/attention.

Maybe it's the same as with other games or computer games... Some people really get something out of fantasy achievements, winning, and feeling like the main character... in a weird way...

[–] V0ldek@awful.systems 18 points 4 days ago (1 children)

My completely PIDOOMA take is that if you're self-interested and manipulative, you're already treating most if not all people as lesser, less savvy, less smart than you. So the fact that you can half-ass shit with a bot and declare yourself an expert in everything, without needing things like "collaboration with other people", ew, is like a shot of cocaine into your eyeball.

LLMs' tone is also very bootlicking, so if you're already narcissistic and you get a tool that tells you yes, you are just the smartest boi, well... To quote a classic, it must be like being repeatedly kicked in the head by a horse.

[–] dgerard@awful.systems 7 points 3 days ago

increasing number of social media responses come across as if the posters think they're giving clarifying orders to a chatbot