this post was submitted on 12 May 2025
609 points (98.9% liked)

Just Post


[โ€“] merc@sh.itjust.works 7 points 1 month ago

They're not even "stupid" though. It's more like if you somehow trained a parrot with every book ever written and every web page ever created and then had it riff on things.

But even then, a parrot is a thinking being. It may not understand the words it's using, but it understands emotion to some extent, and it understands "conversation" to a certain extent (taking turns talking, etc.). An LLM just statistically predicts the word that should appear next.
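To make that "predicts the next word statistically" point concrete, here's a minimal sketch of the idea using a toy bigram counter. Real LLMs use neural networks over long contexts rather than raw counts, and the corpus here is made up for illustration, but the underlying task of modeling which word tends to follow which is the same:

```python
from collections import Counter, defaultdict

# Toy corpus; stands in for "every book and web page ever written".
corpus = "the parrot talks and the parrot listens and the parrot talks".split()

# Count how often each word follows each other word.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict(word):
    # Return the most frequent follower of `word`, with no grasp
    # of what either word means -- pure frequency statistics.
    followers = counts[word]
    return followers.most_common(1)[0][0] if followers else None

print(predict("the"))     # "parrot": the only word ever seen after "the"
print(predict("parrot"))  # "talks": seen twice vs. "listens" once
```

Chain the predictions and you get fluent-looking output with no understanding behind it, which is the commenter's point.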

An LLM is nothing more than an incredibly sophisticated computer model designed to generate words in a way that fools humans into thinking those words have meaning. It's almost more like a lantern fish than a parrot.