this post was submitted on 23 Jun 2025

Generative AI


A community for news, questions, development, and other topics related to Generative AI


In November 2021, in the city of Chandler, Arizona, Chris Pelkey was shot and killed by Gabriel Horcasitas in a road rage altercation.

Horcasitas was tried and convicted of reckless manslaughter.

When it was time for Horcasitas to be sentenced by a judge, Pelkey’s family knew they wanted to make a statement – known as a “victim impact statement” – explaining to the judge who Pelkey had been when he was alive.

They found they couldn’t get the words right.

The solution for them turned out to be having Pelkey speak for himself by creating an AI-generated avatar that used his face and voice, allowing him to “talk” directly to the judge.

[...]

top 3 comments
[–] spankmonkey@lemmy.world 20 points 3 weeks ago (1 child)

The article's authors are equating an avatar reciting what the family thinks the victim would have said with the victim speaking for himself. That is not the same thing, and the judge should not have allowed it.

[–] Madison420@lemmy.world 1 point 2 weeks ago

It'll probably get tossed, but to find out whether it's legal, someone actually has to try it, no matter how dumb it may be.

[–] Fourth@mander.xyz 6 points 2 weeks ago

Uh, this seems hella bad.