this post was submitted on 21 Jul 2025
667 points (98.5% liked)

[–] asudox@lemmy.asudox.dev 60 points 6 days ago* (last edited 6 days ago) (2 children)

I love how the LLM just tells you that it has done something bad, with no emotion, and then proceeds to give detailed information and steps on how it did it.

It feels like mockery.

[–] WolfLink@sh.itjust.works 28 points 6 days ago (1 children)

I wouldn’t even trust what it tells you it did, since that is based on what you asked it and what it thinks you expect

[–] Zron@lemmy.world 9 points 5 days ago (1 children)

It doesn’t think.

It has no awareness.

It has no way of forming memories.

It is autocorrect with enough processing power to make the NSA blush. It just guesses what the next word in a sentence should be. Just because it sounds like a human doesn’t mean it has any capacity for human memory or thought.
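
To be fair, the "guesses what the next word should be" part is easy to see for yourself. A minimal sketch, assuming the Hugging Face transformers library and the public gpt2 checkpoint (my choice for illustration, not anything named in the thread): it just repeatedly picks the most likely next token and appends it.

```python
# Rough sketch of greedy next-token prediction, assuming the Hugging Face
# transformers library and the public "gpt2" checkpoint (illustrative only).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

ids = tokenizer("The database was deleted because", return_tensors="pt").input_ids

for _ in range(20):
    with torch.no_grad():
        logits = model(ids).logits               # scores over the whole vocabulary
    next_id = logits[0, -1].argmax()             # pick the single most likely next token
    ids = torch.cat([ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(ids[0]))                  # prompt plus 20 guessed tokens
```

There is no memory or plan in that loop beyond the growing token sequence itself.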

[–] sukhmel@programming.dev 1 points 4 days ago

Okay, what it predicts you expect /s

It's just a prank bro