this post was submitted on 21 Nov 2025
1020 points (98.9% liked)

Fuck AI

[–] biofaust@lemmy.world 21 points 3 days ago (2 children)

LLMs are the lava lamp of computing.

[–] okwhateverdude@lemmy.world 12 points 3 days ago (1 children)

They are provably less random than a lava lamp, though. And also not very pretty to look at.

[–] biofaust@lemmy.world 8 points 3 days ago (1 children)

They consume a lot of energy for an extremely narrow use case, the output is still random, and companies such as Cloudflare go out of their way to find a use for them even though better and far cheaper alternatives, like weather sensors, exist.

[–] petrol_sniff_king@lemmy.blahaj.zone 2 points 2 days ago (1 children)

Wait a minute.
Wait wait wait wait wait.

I saw the lava lamp thing. Is that just a wacky PR stunt?
It didn't even occur to me that there are less stupid ways of getting hundreds of reliably chaotic numbers.

I gotta look into this more; my brain is spinning.

[–] biofaust@lemmy.world 1 points 2 days ago

Measuring any sufficiently random system at any moment yields usable entropy. You could be stirring tea leaves and it would work just as well.
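
The trick is the same whatever the source: hash the raw, noisy measurements and use the digest as a seed. A minimal sketch of that idea in Python, where `read_noisy_measurements()` is a made-up stand-in for whatever chaotic thing you point a sensor at (here faked with OS entropy plus timing jitter just so the snippet runs on its own):

```python
import hashlib
import os
import random
import time

def read_noisy_measurements(n: int = 64) -> bytes:
    """Hypothetical noise source: in reality this would be camera frames of a
    lava lamp wall, weather sensor readings, stirred tea leaves, etc."""
    samples = bytearray()
    for _ in range(n):
        samples += os.urandom(1)  # stand-in for one raw sensor byte
        # timing jitter as a second (weak) noise component
        samples += (time.perf_counter_ns() & 0xFF).to_bytes(1, "big")
    return bytes(samples)

def seed_from_physical_noise() -> int:
    # Hashing smooths out biases in the raw source; the digest becomes the seed
    # (or gets mixed into the OS entropy pool).
    digest = hashlib.sha256(read_noisy_measurements()).digest()
    return int.from_bytes(digest, "big")

if __name__ == "__main__":
    rng = random.Random(seed_from_physical_noise())
    print(rng.random())
```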

[–] fossilesque@mander.xyz 8 points 3 days ago (1 children)
[–] biofaust@lemmy.world 2 points 3 days ago

Read my reply to the other comment.