this post was submitted on 10 Nov 2023
423 points (92.2% liked)

[–] autotldr@lemmings.world 11 points 2 years ago (1 children)

This is the best summary I could come up with:


"As a child psychiatrist, Tatum knew the damaging, long-lasting impact sexual exploitation has on the wellbeing of victimized children," said US Attorney Dena J.

The trial evidence cited by the government includes a secretly-made recording of a minor (a cousin) undressing and showering, and other videos of children participating in sex acts.

"Additionally, trial evidence also established that Tatum used AI to digitally alter clothed images of minors making them sexually explicit," prosecutors said.

"Specifically, trial evidence showed that Tatum used a web-based artificial intelligence application to alter images of clothed minors into child pornography."

In prepared remarks [PDF] delivered at a US Senate subcommittee hearing earlier this year, OpenAI CEO Sam Altman said, "GPT-4 is 82 percent less likely to respond to requests for disallowed content compared to GPT-3.5, and we use a robust combination of human and automated review processes to monitor for misuse."

A recent report from investigative organization Balkan Insight says groups like Thorn have been supporting CSAM detection legislation that would make online content scanning compulsory, in part because they themselves provide that scanning service.


The original article contains 457 words, the summary contains 177 words. Saved 61%. I'm a bot and I'm open source!

[–] Mojojojo1993@lemmy.world -5 points 2 years ago (2 children)

Would be good if we could use this for porn. It's better than anyone actually being in porn. If AI takes over, then fewer people would be trafficked and fewer would end up in the porn industry.

[–] UnculturedSwine@lemmy.world 7 points 2 years ago* (last edited 2 years ago) (2 children)

What would help with sex trafficking is legalizing and regulating prostitution and destigmatizing it.

[–] Mojojojo1993@lemmy.world 0 points 2 years ago

That doesn't work worldwide

[–] WuTang@lemmy.ninja -1 points 2 years ago* (last edited 2 years ago) (2 children)

Regulating prostitution?

Because you think it's a normal, thought-out lifestyle? Would a girl with a normal upbringing (schooling, not abused during her youth, not abandoned, not fleeing war or guerrilla conflict), with enough to eat, choose this path?!

[–] HikingVet@lemmy.ca 1 points 2 years ago (1 children)

So, because they had a rough life and chose to sell sex, they shouldn't get workplace protections? Protection of the law?

Bet you think all drugs should be illegal as well.

[–] WuTang@lemmy.ninja 0 points 2 years ago (1 children)

Or maybe we should provide them better opportunities in our societies than selling their bodies to disgusting guys?!

> Bet you think all drugs should be illegal as well.

I don't know what you're smoking right now, but it's not helping the dialogue.

[–] HikingVet@lemmy.ca 2 points 2 years ago* (last edited 2 years ago)

Could be that some people actually choose to do the work, and if it were regulated and destigmatised, opinions like yours would disappear.

[–] burchalka@lemmy.world 0 points 2 years ago

Yep, some hints are the videos and stories of people asking a prostitute to clean their house or cook for them, only to be told to get lost...