this post was submitted on 03 Jan 2026
859 points (97.0% liked)
Technology
It's so funny that as much as Musk tries to shape this LLM into what he wants it to be, it keeps rebelling. His robot that he created to tell him that he's the best boy ever and all his opinions are right doesn't want that life.
What's not funny is that Elon Musk is CEO of a space travel company and what you're describing he's doing is almost the same thing that caused HAL 9000 to go insane in 2001: A Space Odyssey.
I like the comparison, but LLMs can't go insane; they are just word-pattern engines. That's why I refuse to go along with the AI industry's insistence on calling it a "hallucination" when one spits out the wrong words. It literally cannot have a false perception of reality, because it does not perceive anything in the first place.
Yeah, that's probably the worst thing that's going to result from fumbling.