this post was submitted on 18 Jul 2025
247 points (94.9% liked)

Technology

[–] match@pawb.social 9 points 3 days ago (4 children)

Isn't this just paranoid schizophrenia? I don't think ChatGPT can cause that.

[–] Alphane_Moon@lemmy.world 18 points 3 days ago (2 children)

I have no professional expertise in this area, but I would speculate that the fellow was already predisposed to schizophrenia and the LLM just triggered it (this can happen with other things too, like psychedelic drugs).

[–] SkaveRat@discuss.tchncs.de 5 points 2 days ago

I'd say it either triggered on its own or was potentially triggered by drugs, and then he started using an LLM and found all the patterns to feed that schizophrenic paranoia. It's a very self-reinforcing loop.

[–] zzx@lemmy.world 2 points 3 days ago

Yup. LLMs aren't making people crazy, but they are making crazy people worse.

[–] leftzero@lemmy.dbzer0.com 5 points 2 days ago

LLMs are obligate yes-men.

They'll support and reinforce whatever rambling or delusion you talk to them about, and provide “evidence” to back it up (made-up evidence, of course, but if you're already down the rabbit hole you'll buy it).

And they'll keep doing that as long as you let them, since they're designed to keep you engaged (and paying).

They're extremely dangerous for anyone with the slightest addictive, delusional, suggestible, or paranoid tendencies, and should be regulated as such (but won't).

[–] nimble@lemmy.blahaj.zone 7 points 2 days ago

LLMs hallucinate and are generally willing to go down rabbit holes, so if you have some crazy theory you're more likely to get a false positive from ChatGPT.

So I think it just exacerbates things more than the alternatives do.

[–] Skydancer@pawb.social 2 points 2 days ago (1 children)

Could be. I've also seen similar delusions in people with syphilis that went un- or under-treated.

[–] ScoffingLizard@lemmy.dbzer0.com 1 points 21 hours ago (1 children)

Where tf are people not treated for syphilis?

[–] Skydancer@pawb.social 1 points 16 hours ago

In this case, the United States. When healthcare is expensive and hard to access, not everybody gets it.

Syphilis symptoms can be so mild they go unnoticed. When you combine that with risky sexual behavior (hook-up culture, anti-condom bias) and lack of testing due to inadequate medical care, you can wind up with untreated syphilis. If you become homeless, care gets even harder to access.

You get diagnosed at a late stage, when treatment is more difficult. They put you on a treatment plan, but follow-up depends on reliable transportation, and the mental effects of the disease have made you paranoid. Now imagine you're also a member of a minority on which medical experiments have historically been done without consent or notice.

You don't really trust that those pills are what you've been told they are. So difficulty accessing healthcare, changing clinics as you move around with your medical history not always keeping up, distrust of the providers and treatment, and general instability and lack of a regular routine all add up to taking your medication only inconsistently.

Result: under-treated syphilis