this post was submitted on 07 Mar 2024
269 points (91.9% liked)

Memes

[–] Omega_Haxors@lemmy.ml 16 points 1 year ago* (last edited 1 year ago) (21 children)

Oh no we are NOT doing this shit again. It's literally autocomplete brought to its logical conclusion, don't bring your stupid sophistry into this.

[–] UraniumBlazer@lemm.ee 1 point 1 year ago (9 children)

Your brain is just a biological system that works somewhat like a neural net. So according to your statement, you too are nothing more than an autocomplete machine.

[–] Omega_Haxors@lemmy.ml 0 points 1 year ago* (last edited 1 year ago) (5 children)

I'm starting to wonder if any of you even know how that shit works internally, or if you just take what the hype media says at face value. It has one purpose and one purpose alone: determine what the next word is going to be by calculating which word is most likely to come next. That's it. All it does is try to string together a convincing sentence using probabilities. It does not and cannot understand context.
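
For what it's worth, here's a toy sketch of what "calculating which word is most likely to come next" means. This is not a real LLM; the words and scores are made up for illustration, and a real model produces its scores from billions of learned parameters:

```python
import math

# Hypothetical raw scores ("logits") a model might assign to candidate
# next words after some prompt -- made-up numbers for illustration only.
logits = {"mat": 4.0, "roof": 2.5, "moon": 0.5}

def softmax(scores):
    """Turn raw scores into a probability distribution that sums to 1."""
    m = max(scores.values())  # subtract the max for numerical stability
    exps = {w: math.exp(s - m) for w, s in scores.items()}
    total = sum(exps.values())
    return {w: e / total for w, e in exps.items()}

probs = softmax(logits)

# Greedy decoding: just pick the single most probable next word.
next_word = max(probs, key=probs.get)
```

The whole generation loop is just this step repeated: append the chosen word to the prompt, score the candidates again, pick again.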

The underlying tech is really cool, but a lot of people are grotesquely overselling its capabilities. Not to say a neural network can't eventually obtain consciousness (because ultimately our brains are a union of a bunch of little neural networks working together for a common goal), but it sure as hell isn't going to be an LLM. That's what I meant by sophistry: they're not engaging with the facts, just some nebulous ideal.

[–] alphafalcon@feddit.de 2 points 1 year ago

I'm with you on LLMs being overhyped, although that's already dying down a bit. But regarding your claim that LLMs cannot "understand context": I've recently read an article showing that LLMs can have an internal world model:

https://thegradient.pub/othello/

Depending on your definition of "understanding", that seems to be an indicator of being more than a pure "stochastic parrot".
