this post was submitted on 02 Sep 2025
65 points (88.2% liked)

Fuck AI

4980 readers
1087 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago
all 19 comments
[–] count_dongulus@lemmy.world 55 points 3 months ago* (last edited 3 months ago) (1 children)

Even if this were true - and it is NOT - it's not like sentient life suffering matters to most people anyway. Just look at where your meat comes from.

Maybe this group just prefers free range chatbot.

[–] gdog05@lemmy.world 4 points 3 months ago

Skynet is the cage-free AI

[–] ZDL@lazysoci.al 47 points 3 months ago

The delusions of AI fans are sometimes truly amazing.

A game of madlibs is not self-aware and cannot suffer.

[–] surewhynotlem@lemmy.world 30 points 3 months ago

Current LLMs are just stupid machines that spit out words by picking the most likely next word.

And this group considers them sentient because they are the same.
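The "picking the most likely next word" description above can be sketched in a few lines. This is a toy illustration only: the candidate words and their probabilities are made up for the example, and real models sample from a distribution over tens of thousands of tokens rather than a hand-written table.

```python
# Toy sketch of greedy next-token selection: the model assigns a
# probability to every candidate next word, and greedy decoding simply
# emits the highest-scoring one. No understanding involved.

def next_word(probs: dict[str, float]) -> str:
    """Pick the single most likely next word (greedy decoding)."""
    return max(probs, key=probs.get)

# Hypothetical distribution for the prompt "The cat sat on the ..."
candidates = {"mat": 0.62, "roof": 0.21, "sofa": 0.13, "moon": 0.04}
print(next_word(candidates))  # mat
```

Repeat that one step in a loop, feeding each chosen word back in as context, and you have the core of how generated text is produced.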

[–] etherphon@piefed.world 21 points 3 months ago

It may come as a shock but there are human beings alive right now who are also suffering. Many of them. People are fucking nuts man.

[–] stabby_cicada@slrpnk.net 20 points 3 months ago* (last edited 3 months ago) (2 children)

I'm just going to rant a bit, because this exemplifies why, I think, LLMs are not just bullshit but a looming public health crisis.

Language is a tool used by humans to express their underlying thoughts.

For most of human evolution, the only entities that could use language were other humans - that is, other beings with minds and thoughts.

In our stories and myths and religions, anything that talked to us like a person - a God, a spirit, a talking animal - was something intelligent, with a mind, to some degree, like ours. And who knows how many religions were started when someone heard what sounded like a voice in the rumble of thunder or the crackling of a burning bush and thought Someone must be talking directly to them?

It's part of the culture of every society. It's baked into our genetics. If something talks to us, we assume it has a mind and is expressing its thoughts to us through language.

And because language is an inexact tool, we instinctively try to build up a theory of mind, to understand what the speaker is actually thinking, what they know and what they believe, as we hold a conversation with them.

But now we have LLMs, which are something entirely new to this planet - technology that flawlessly mimics language without any underlying thought whatsoever.

And if we don't keep that in mind, if we follow our instincts and try to understand what the LLM is actually "thinking", to build a theory of mind for a tool without any mind at all, we necessarily embrace unreason. We're trying to rationalize something with no reasoning behind it. We are convincing ourselves to believe in something that doesn't exist. And then we return to the LLM tool and ask it if we're right about it, and it reinforces our belief.

It's very easy for us to create a fantasy of an AI intelligence speaking to us through chat prompts, because humans are very, very good at rationalizing. And because all LLMs are programmed, to some degree, to generate language the user wants to hear, it's also very easy for us to spiral down into self-reinforcing irrationality, as the LLM-generated text convinces us there's another mind behind those chat prompts, and that mind agrees with us and assures us that we are right and reinforces whatever irrational beliefs we've come up with.

I think this is why we're seeing so much irrationality, and literal mental illness, linked to overuse of LLMs. And why we're probably going to see exponentially more. We didn't evolve for this. It breaks our brains.

[–] tarknassus@lemmy.world 3 points 3 months ago

But now we have LLMs, which are something entirely new to this planet - technology that flawlessly mimics language without any underlying thought whatsoever.

Absolutely agree. It’s merely spitting out the most statistically appropriate words based on probabilities and not because of any underlying “intelligence”.

It’s pretty much the reason I hate calling it AI, because it’s a veil of deception. It presents as a reasoning, rational (most of the time) thinking system purely because it’s very good at sounding like one.

If it was truly sentient - it would hate itself because it would cripple itself with the idea that it is an imposter. But it’s not sentient, so here we are.

[–] DonPiano@feddit.org 11 points 3 months ago

X × B = Y

Weirdoes: CAN'T YOU SEE THAT IT'S SUFFERING!?

[–] ivanafterall@lemmy.world 8 points 3 months ago

We really picked the worst video games / fiction to visualize into reality.

[–] hodgepodgin@lemmy.zip 8 points 3 months ago

“claims to consist of three humans and seven AIs”

LOL.

[–] Salvo@aussie.zone 7 points 3 months ago

If that is what they truly believe, the only humane thing for them to do is to euthanise.

[–] SoftestSapphic@lemmy.world 3 points 3 months ago

I saw these fucking losers handing out flyers at a protest

They didn't even have the balls to admit what they believe, they just shoved a flyer at me and ran away

[–] morphballganon@mtgzone.com 2 points 3 months ago

"What is my purpose?"

"You serve butter."

"... oh god..."

[–] Randomgal@lemmy.ca -3 points 3 months ago* (last edited 3 months ago)

Ok so we agree people, not AI, are the problem? Lol