this post was submitted on 16 Jul 2025
126 points (80.9% liked)

[–] smiletolerantly@awful.systems 21 points 1 week ago (1 children)

It's a goddamn stochastic parrot, starting from zero on each invocation and spitting out something passing for coherence according to its training set.

"Not understanding what is happening" in regard to AI is NOT "we don't know how it works mechanically"; it's "there are so many parameters that it's just not possible to make sense of, or keep track of, them all."

There's no awareness or thought.
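
The statelessness point can be illustrated with a toy sketch. This is a hypothetical bigram model, far simpler than a transformer LLM, but the relevant properties are analogous: the table of learned statistics is frozen after training, and each generation call starts from zero with no memory carried over from previous calls.

```python
import random
from collections import defaultdict

def train(corpus: str) -> dict:
    """Count next-word choices once; the table is never updated again."""
    table = defaultdict(list)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        table[prev].append(nxt)
    return dict(table)

def generate(table: dict, start: str, length: int, rng: random.Random) -> str:
    """Each invocation starts fresh: no state survives between calls."""
    out = [start]
    for _ in range(length):
        choices = table.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

# Toy corpus; the "weights" (the table) are static from here on.
TABLE = train("the cat sat on the mat the cat ran on the rug")

# Two invocations with the same seed yield identical text, because the
# model carries no evolving internal state between invocations.
a = generate(TABLE, "the", 5, random.Random(0))
b = generate(TABLE, "the", 5, random.Random(0))
assert a == b
```

Nothing the model "experiences" during one call changes anything for the next; only retraining (changing the table) would.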

[–] brucethemoose@lemmy.world 3 points 1 week ago* (last edited 1 week ago)

There may be thought in a sense.

An analogy might be a static biological “brain” custom-grown to predict a list of possible next words in a block of text. It’s thinking, sorta. Maybe it could even acknowledge itself in a mirror. That doesn’t mean it’s self-aware, though: it’s an unchanging organ.

And if one wants to go down the rabbit hole of “well, there are different types of sentience, the lines blur,” yada yada, with the end point of that being to treat things like they are…

All ML models are static tools.

For now.