this post was submitted on 21 Sep 2025
188 points (98.5% liked)

[–] BombOmOm@lemmy.world 13 points 1 week ago* (last edited 1 week ago) (1 children)

A hallucination is something that disagrees with your active inputs (ears, eyes, etc.). AIs don't have these active inputs; all they have is the human equivalent of memories. Everything they draw up is a hallucination, literally all of it. It's simply coincidence when a hallucination matches reality.

Is it really surprising that the thing that can only create hallucinations is often wrong? That the thing that can only create hallucinations will continue to be wrong on a regular basis in the future as well?

My guy, Microsoft Encarta 97 doesn't have senses either, and its recollection of the capital of Austria is neither coincidence nor hallucination.
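
To make the distinction being drawn here concrete, a minimal Python sketch (the data and names are hypothetical, for illustration only) contrasting Encarta-style deterministic lookup with LLM-style sampling from a learned distribution. The lookup answer is retrieved from a stored fact, so its correctness is not coincidence; the generated answer is drawn from weights and is right only as often as the learned distribution happens to favor the truth.

```python
import random

# Encarta-style: a fixed lookup table. The answer is retrieved, not generated;
# when it is right, it is right because the stored fact is right.
CAPITALS = {"Austria": "Vienna", "France": "Paris"}

def lookup_capital(country: str) -> str:
    return CAPITALS[country]  # deterministic retrieval

# LLM-style (toy): sample an answer from a learned probability distribution.
# Nothing here consults the outside world at answer time; the output is drawn
# from stored weights ("memories"), whether or not it matches reality.
LEARNED_DIST = {"Vienna": 0.90, "Salzburg": 0.07, "Geneva": 0.03}

def generate_capital(_country: str) -> str:
    tokens, weights = zip(*LEARNED_DIST.items())
    return random.choices(tokens, weights=weights, k=1)[0]  # probabilistic generation

print(lookup_capital("Austria"))    # always "Vienna"
print(generate_capital("Austria"))  # usually "Vienna", occasionally not
```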