this post was submitted on 21 Jul 2025
665 points (98.5% liked)

Technology

287 readers
435 users here now

Share interesting Technology news and links.

Rules:

  1. No paywalled sites at all.
  2. News articles has to be recent, not older than 2 weeks (14 days).
  3. No videos.
  4. Post only direct links.

To encourage more original sources and keep this space commercial free as much as I could, the following websites are Blacklisted:

More sites will be added to the blacklist as needed.

Encouraged:

founded 2 months ago
MODERATORS
 
you are viewing a single comment's thread
view the rest of the comments
[–] SendMePhotos@lemmy.world 45 points 5 days ago (32 children)
[–] ech@lemmy.ca 80 points 5 days ago (29 children)

Both require intent, which these do not have.

[–] RedPandaRaider@feddit.org 0 points 4 days ago* (last edited 4 days ago) (7 children)

Lying does not require intent. All it requires is knowing an objective truth and saying something that contradicts or conceals it.

As far as an LLM is concerned, the data it was trained on and any data it is later fed are fact. Mimicking human behaviour such as lying still makes it lying.

[–] ech@lemmy.ca 8 points 4 days ago

Except these algorithms don't "know" anything. They convert their training data into a statistical model that generates (hopefully) sensible text, with literal random noise steering each choice. At no point in that process is knowledge used.
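The point about random noise can be made concrete. A minimal sketch of the next-token draw at the heart of LLM text generation (the function name, the toy logits, and the temperature value are illustrative assumptions, not any particular model's API):

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=None):
    """Pick the next token id by a weighted random draw over model scores.

    The only "decision" here is sampling from a softmax distribution;
    no facts or beliefs are consulted at any point.
    """
    rng = rng or random.Random()
    # Temperature rescales the scores: lower values make the draw
    # more deterministic, higher values make it more random.
    scaled = [score / temperature for score in logits]
    # Numerically stable softmax over the scaled scores.
    peak = max(scaled)
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Weighted random choice: this is where the noise enters.
    r = rng.random()
    cumulative = 0.0
    for token_id, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return token_id
    return len(probs) - 1

# At very low temperature the highest-scoring token always wins.
print(sample_next_token([0.1, 5.0, 0.3], temperature=0.01))
```

At every step the model emits scores and a random number picks among them; whether the resulting sentence is true never enters the computation, which is why "lying" presupposes machinery these systems don't have.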
