this post was submitted on 15 May 2025
Can we normalize not calling them hallucinations? They’re not hallucinations. They are fabrications; lies. We should not be romanticizing a robot lying to us.
It's pretty ingrained vocabulary at this point. "Lies" implies intent; I would have preferred "errors".
Also, for the record, this is the most dystopian headline I've come across to date.
If a human doesn't know the answer to a question, yet makes some shit up instead of saying "I don't know", what would you call that?
Bullshit.
This is what I've been calling it. Not as a pejorative, just descriptive. It has no concept of truth or untruth; it just tells good-sounding stories. It's just bullshitting. It's a bullshit engine.