this post was submitted on 25 Oct 2023
74 points (80.8% liked)

Technology

[–] fubo@lemmy.world 57 points 2 years ago* (last edited 2 years ago) (2 children)

Deepfakes of an actual child should be considered defamatory use of a person's image; but they aren't evidence of actual abuse the way real CSAM is.

Remember, the original point of the term "child sexual abuse material" was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse -- such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

Purely fictional depictions, not involving any actual child being abused, are not evidence of a crime. Even deepfake images depicting a real person, but without their actual involvement, are a different sort of problem from actual child abuse. (And should be considered defamatory, same as deepfakes of an adult.)

But if a picture does not depict a crime of abuse and does not depict a real person, it is basically an illustration, the same as if it were drawn with a pencil.

[–] Uranium3006@kbin.social 10 points 2 years ago (1 children)

Remember, the original point of the term “child sexual abuse material” was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse – such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

Although that distinction lasted about a week before the same bad actors who cancel people over incest fanfic started calling all of the latter CSEM too.

[–] fubo@lemmy.world 11 points 2 years ago* (last edited 2 years ago) (1 children)

As a sometime fanfic writer, I do notice when fascists attack sites like AO3 under a pretense of "protecting children", yes.

[–] Uranium3006@kbin.social 6 points 2 years ago

And it's usually fascists, or at least people who may not consider themselves as such but who think and act like fascists anyway.

[–] Pyro@pawb.social 2 points 2 years ago (2 children)

To add an extra twist: hopefully, if the sickos are at least happy with the AI stuff, they won't need the "real" thing.

Sadly, a lot of it does evolve from wanting to "watch" to wanting to do.

[–] topinambour_rex@lemmy.world 4 points 2 years ago

Sadly, a lot of it does evolve from wanting to "watch" to wanting to do.

Have you got a source for this?

[–] fubo@lemmy.world 2 points 2 years ago (1 children)

Some actually fetishize causing suffering.

[–] JohnEdwa@sopuli.xyz 3 points 2 years ago* (last edited 2 years ago)

Some people are sadists and rapists, yes, regardless of what age group they'd want to do it with.