this post was submitted on 09 Jul 2025
492 points (91.8% liked)

Science Memes

[–] Scubus@sh.itjust.works 11 points 1 day ago (1 children)

... so the article should focus on stopping the users from doing that? There is a lot to hate AI companies for, but their tool being useful is actually at the bottom of that list.

[–] Samskara@sh.itjust.works 4 points 22 hours ago* (last edited 22 hours ago) (1 children)

People in distress will talk to an LLM instead of calling a suicide hotline. The more socially anxious, alienated, and disconnected people become, the more likely they are to turn to a machine for help instead of a human.

[–] Scubus@sh.itjust.works 3 points 20 hours ago (1 children)

Ok, people will turn to Google when they're depressed. A couple of months ago I googled the least painful way to commit suicide. Google gave me the info I was looking for. Should I be mad at them?

[–] Samskara@sh.itjust.works 1 points 20 hours ago (1 children)

You are ignoring that people are already developing personal emotional relationships with chatbots. That's not the case with search bars.

For queries like that, the first line above Google's search results is a suicide hotline phone number.

A chatbot should provide at least that as well.

I'm not saying it should provide no information.

[–] Scubus@sh.itjust.works 0 points 19 hours ago (1 children)

Ok, then we are in agreement. That is a good idea.

I think that at low levels the tech should not be hindered just because a subset of users use the tool improperly. There is a line, however, though I'm not sure where it is. If that problem were to become as widespread as, say, gun violence, then I would agree that the utility of the tool may need to be limited to curb its negative influence.

[–] Samskara@sh.itjust.works 3 points 19 hours ago (1 children)

It’s about providing some safety measures to protect the most vulnerable. They need to be thrown a lifeline and an exit sign on their way down.

For gun purchases, these can be waiting periods of a few days. So you don’t buy a gun in anger and kill someone, regretting it immediately and ruining many people’s lives.

Did you have to turn off safe search to find methods for suicide?

[–] Scubus@sh.itjust.works 0 points 19 hours ago (1 children)

I do not recall, although if I did, it clearly wasn't much of a hindrance. We do seem to be in agreement on this, though I have a tangentially related question for you: do you believe suicide should be a human right?

[–] Samskara@sh.itjust.works 1 points 19 hours ago* (last edited 19 hours ago) (1 children)

People will always find a way to kill themselves. Lots of ways to kill yourself with things in your own house.

Punishing people for attempting or committing suicide is pointless. People shouldn't be encouraged to commit suicide; for most people the desire will pass, and they will find joy in their lives again. Suicide doesn't only affect the person who dies. It mainly affects the people who knew them: the ones who lose a loved one, and of course those who have to clean up the mess left behind.

Assisted suicide is a bit more complicated. People might be pressured into suicide by their family members or by society so they are no longer a burden. The worst version of this is commercially available assisted suicide run for profit. Imagine literal "kill yourself" advertisements offering services that get richer the more people off themselves, or chatbots messaging depressed folks and nudging them toward suicide. There have been cults that committed ritualistic mass suicides. I don't think these are good for society, so there needs to be pretty strict regulation around this.

A friend of mine wanted to kill himself. What stopped him was the idea that if you have nothing to live for, find something worth dying for. He’s now an adventurer and explorer in extreme environments. For a while he also considered joining the Ukrainian foreign legion. A glorious heroic death doing something worthwhile is not the worst idea. If you don’t die, you will feel more alive than ever.

[–] Scubus@sh.itjust.works 1 points 14 hours ago

That's a super cool outlook! Props to him for coming up with it. I really appreciate the response; I like your insights, and I pretty much agree with all of that. There is another group, though: people who have struggled with suicidal ideation their entire lives and who, for immutable reasons, will continue to struggle with it until they die. For those people, there should be a humane path. But filtering them out from the temporarily depressed seems a gargantuan feat.