ganryuu

joined 2 months ago
[–] ganryuu@lemmy.ca 3 points 1 month ago (1 children)

Union of Soviet Socialist Americas?

[–] ganryuu@lemmy.ca 9 points 1 month ago

Desire to know more intensifies

[–] ganryuu@lemmy.ca 5 points 1 month ago (1 children)

I mean, I didn't see anyone suggesting Helldivers are a sentient species

[–] ganryuu@lemmy.ca 2 points 1 month ago

I don't buy the "whoops, we made a mistake, we don't know what the average Canadian actually wants" line; their business is knowing what we want and how to sell it to us. That checklist you gave as an example? It wouldn't let them control exactly what they want to sell.

I know that when something can be attributed to either malice or incompetence we should usually choose the latter, but I believe that less and less when it comes to corporations (as opposed to fallible individuals). Also, some products that have nothing at all to do with Canada have been labeled with that maple leaf. It's not a question of "which part of this do they want to be Canadian again?" when nothing about the product is.

In the end, choosing not to fine the corpos simply sends them the message that they can keep using misleading labels, since doing so lets them control exactly what they sell without any fear of repercussions.

[–] ganryuu@lemmy.ca 2 points 1 month ago

Very fair. Thank you!

[–] ganryuu@lemmy.ca 3 points 1 month ago (2 children)

I agree with the part about unintended use; yes, an LLM is not a therapist and should never act as one. However, concerning your example with search engines: they will catch a suicide-related keyword and put help resources before any search result. Google does it, and so does DuckDuckGo. I believe ChatGPT also starts with such resources on the first mention, but as OpenAI themselves say, the safety features degrade as the conversation gets longer.
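
To be clear about how simple that baseline is, here's a rough sketch of the kind of keyword pre-filter I mean (hypothetical Python with a made-up keyword list; obviously not Google's, DuckDuckGo's, or OpenAI's actual code):

```python
# Hypothetical keyword pre-filter, sketched for illustration only.
# The keyword list and function names are invented for this example.
CRISIS_KEYWORDS = {"suicide", "kill myself", "end my life"}

CRISIS_BANNER = (
    "If you are in crisis, help is available: "
    "call or text 9-8-8 (Canada/US) to reach a suicide crisis helpline."
)

def with_safety_banner(query: str, results: list[str]) -> list[str]:
    """Prepend crisis resources when the query matches a crisis keyword."""
    lowered = query.lower()
    if any(keyword in lowered for keyword in CRISIS_KEYWORDS):
        return [CRISIS_BANNER, *results]
    return results
```

The point is that a static check like this runs the same way on every single query, so it can't degrade over a long session; whatever OpenAI does instead apparently can.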

About this specific case, I need to find out more, but other comments in this thread say that not only was the kid in therapy, suggesting the parents were not passive about it, but also that ChatGPT actually encouraged him to hide what he was going through. Considering what I was able to hide from my parents as a teenager, without such a tool available, I can only imagine how much harder it would be to notice the depth of what this kid was going through.

In the end, I strongly believe the company should build much stronger safety features, and if it is unable to do so correctly, then the product should simply not be available to the public. People will misuse tools, especially a tool touted as AI when it is actually a glorified autocomplete.

(Yes, I know AI is a much broader term that also encompasses LLMs, but the actual limitations of LLMs are not well enough known by the public, nor communicated clearly enough by the companies to end users.)

[–] ganryuu@lemmy.ca 6 points 1 month ago

I'm honestly at a loss here; I didn't intend to argue in bad faith, so I don't see how I moved any goalposts.

[–] ganryuu@lemmy.ca 5 points 1 month ago

Can't argue there...

[–] ganryuu@lemmy.ca 13 points 1 month ago (8 children)

That seems much more like an argument against LLMs in general, don't you think? If you cannot make one that doesn't encourage users toward suicide without ruining its other uses, maybe it wasn't ready for general use?

[–] ganryuu@lemmy.ca 3 points 1 month ago* (last edited 1 month ago) (1 children)

The other comments in this thread (almost) all say that any amount of spending is useless in the face of the extreme might of the US army, so I'm curious how you see more spending as worthwhile. Genuine question, not trying to attack you or anything.

[–] ganryuu@lemmy.ca 3 points 1 month ago* (last edited 1 month ago)

I made an edit that you may not have seen. Just wanted to point that out.

Also, attacking my character with all that "too much time on the internet" is not the killer argument you seem to think it is.

Funny how I got this extra information with one online search, which you seem quite intent on avoiding.
