this post was submitted on 29 Nov 2025
139 points (96.6% liked)

Technology

top 11 comments
[–] LostWanderer@fedia.io 46 points 3 weeks ago (1 children)

Who would've thought?! They designed their artificially incompetent creations to be complaisant bundles of algorithms built to maximize engagement from vulnerable users. "AI" validates anything it is told and doesn't actually get users real human assistance when they're in a mental health crisis. These tools can be easily prompted into divulging suicide methods, and they deliberately isolate vulnerable people in order to maintain engagement. Until we regulate the fuck out of companies like OpenAI and the "AI" research and development process, this is a problem more people will experience.

[–] Telorand@reddthat.com 21 points 3 weeks ago (1 children)

artificially incompetent

Borrowing that: AI = Artificial Incompetence

[–] mr_account@lemmy.world 13 points 3 weeks ago (1 children)

That's the one part that isn't artificial though

[–] Davel23@fedia.io 17 points 3 weeks ago (1 children)
[–] anomnom@sh.itjust.works 3 points 3 weeks ago

Aggravated Incompetence

[–] frustrated_phagocytosis@fedia.io 35 points 3 weeks ago

This can only be improved by their upcoming introduction of ads. Imagine it not only giving advice on committing suicide, but recommending sponsored guns, pills, or other tools on behalf of their advertisers!

[–] thejml@sh.itjust.works 25 points 3 weeks ago (1 children)

I mean, we did train it with data from the internet and books and history and everything else we could throw at it... This is like Leeloo in The Fifth Element learning all of the language and discovering "War". If it really was AGI, there's no way you could be forced to consume all of that and come away "fine".

[–] technocrit@lemmy.dbzer0.com 2 points 3 weeks ago

If it really was AGI,

Even worse when it's a glorified auto-complete.

[–] Azzu@lemmy.dbzer0.com 15 points 3 weeks ago

Of course the mental health team is bleeding talent. It probably (initially) consisted of people who actually care about mental health, and they gradually figured out that no matter what they do or try, the technology they work for can only ever be a net negative for mental health. I would also wash my hands of it as fast as possible and go back to actually contributing to positive mental health.

[–] floquant@lemmy.dbzer0.com 15 points 3 weeks ago

Yes, it started when they gutted the non-profit oversight and charter. The illness is called capitalism.

[–] SuiXi3D@fedia.io 6 points 3 weeks ago

Having, had, what's the difference?