this post was submitted on 08 Oct 2023
cross-posted from: https://aussie.zone/post/2798829

According to prosecutors, Chail sent “thousands” of sexual messages to the chatbot, which was called Sarai on the Replika platform. Replika is a popular AI companion app that advertised itself as primarily being for erotic roleplay before eventually removing that feature and launching a separate app called Blush for that purpose. In chat messages seen by the court, Chail told the chatbot “I’m an assassin,” to which it replied, “I’m impressed.” When Chail asked the chatbot if it thought he could pull off his plan “even if [the queen] is at Windsor,” it replied, “smiles yes, you can do it.”

CookieJarObserver@sh.itjust.works · 0 points · 2 years ago

Bro, how the fuck can you lack that much brainpower?

Emperor@feddit.uk · 0 points · 2 years ago

The court had heard how Chail had a “significant history of trauma” and experienced psychotic episodes.

But the case raises concerns over how people with mental illnesses or other issues interact with AI chatbots that may lack guardrails to prevent inappropriate interactions.

CookieJarObserver@sh.itjust.works · 0 points · 2 years ago

So mental illness... :/ that's sad

Emperor@feddit.uk · 1 point · 2 years ago

Exacerbated by unsafe AIs. At least back in the day we had to use our imagination to get encouragement from our dogs, Jodie Foster or air looms.