this post was submitted on 16 Jul 2025
123 points (80.6% liked)
Technology
72932 readers
2769 users here now
you are viewing a single comment's thread
They’re not robots. They have no self awareness. They have no awareness period. WTF even is this article?
If a program is given a set of instructions, it should carry out those instructions.
If a program not only fails to follow its instructions but gives itself its own set, and the programmers don't understand what's actually happening, that may be cause for concern.
"Self aware" or not. (I'm sure an AI would pass the mirror test.)
People seem to have no problem with the term machine learning. Or the intelligence in AI. We seem to be unwilling to consider a consciousness that is not anthropocentric. Drawing that big red line with semantics we create. It can learn. It can defend itself. It can manipulate and cause users harm. It wants to survive.
Sometimes we need to create new words or definitions to explain new things.
Remember when animals were not conscious beings just driven by instinct or whatever we told ourselves to make us feel better?
Is a bee self aware? Is it conscious? Does it eat, learn, defend, attack? Does it matter what we say it is or isn't?
There are humans we say have no conscience.
Maybe ai is just the sum of human psychopathy / psychosis.
Either way, semantics are semantics, and we ourselves might just be simulations in a holographic universe.
It's a goddamn stochastic parrot, starting from zero on each invocation and spitting out something passing for coherence according to its training set.
"Not understanding what is happening" with regard to AI is NOT "we don't know how it works mechanically"; it's "there are so many parameters that it's just not possible to make sense of, or keep track of, them all".
There's no awareness or thought.
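The "stochastic parrot" point above can be sketched with a toy bigram sampler. This is a hypothetical, drastically simplified stand-in for a real LLM (which uses learned transformer weights, not raw word counts), but it shows the shape of the claim: each call starts from scratch with no memory, and the output is just likely-next-words sampled from patterns in the training text.

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Count which word follows which in the training text."""
    words = text.split()
    follows = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        follows[prev].append(nxt)
    return follows

def parrot(follows, start, length, seed=None):
    """Generate text by repeatedly sampling a plausible next word.
    Each call starts fresh: nothing is remembered or learned between calls."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:  # no known continuation: stop
            break
        out.append(rng.choice(options))
    return " ".join(out)

# Toy "training set"
model = train_bigrams("the cat sat on the mat and the cat ran")
print(parrot(model, "the", 5, seed=0))
```

With the same seed and inputs it produces the same output every time; there is no state carried over and no weights updated, which is the sense in which it "starts from zero on each invocation".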
There may be thought in a sense.
An analogy might be a static biological “brain” custom grown to predict a list of possible next words in a block of text. It’s thinking, sorta. Maybe it could acknowledge itself in a mirror. That doesn’t mean it’s self aware, though: it’s an unchanging organ.
And if one wants to go down the rabbit hole of “well, there are different types of sentience, lines blur,” yada yada, with the end point of that being to treat things as though they are…
All ML models are static tools.
For now.
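The "static tool" / "unchanging organ" claim above can be sketched as well: at inference time a deployed model's parameters are read-only, and running it never updates them. This is a minimal illustrative sketch (a made-up `FrozenModel` class, not any real framework API); learning would only happen in a separate training phase.

```python
class FrozenModel:
    """Hypothetical minimal model whose forward pass never mutates
    its parameters, illustrating the "static tool" claim."""

    def __init__(self, weights):
        # Stored as a tuple: immutable, so inference cannot update them.
        self._weights = tuple(weights)

    def predict(self, x):
        # Weighted sum: reads the weights, never writes them.
        return sum(w * xi for w, xi in zip(self._weights, x))

m = FrozenModel([0.5, -1.0])
before = m._weights
y = m.predict([2.0, 1.0])    # 0.5*2.0 + (-1.0)*1.0 -> 0.0
assert m._weights == before  # parameters unchanged by inference
```

However many times you call `predict`, the model is exactly the same afterward; adapting it would require an explicit, separate retraining step.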