this post was submitted on 18 Mar 2026
309 points (97.0% liked)

Technology
[–] L3s@lemmy.world 6 points 12 hours ago

People can't be civil, locking.

[–] h54@programming.dev 230 points 1 day ago (6 children)

Isn't their slogan "shop like a billionaire"? This tracks.

[–] ZILtoid1991@lemmy.world 10 points 15 hours ago

Billionaires rape real children.

[–] wrinkle2409@lemmy.cafe 80 points 1 day ago (34 children)

I don't understand this. They're dolls; they aren't alive. Why would people care? This may be controversial, but I'd rather have a pedophile fucking a doll than raping a child.

[–] ulterno@programming.dev 6 points 15 hours ago

They are passing this legislation to steer people's focus away from the real CSA.

Remember: CSAM is just the symptom. CSA is the actual cause.

[–] Iconoclast@feddit.uk 17 points 21 hours ago (3 children)

It's a moral panic, pure and simple. It's the same reason some countries want to ban cartoon/animated pictures where the fictional character looks too young. I guess the underlying assumption there is that it'll increase the number of people offending against real children, but I don't think there's any evidence to back that up.

If it were up to me, the criterion would be whether an actual person is being hurt, either directly or as a consequence. That would include real violence, real pictures, and possibly also GenAI stuff if it's trained on real content.

[–] village604@adultswim.fan 74 points 1 day ago* (last edited 1 day ago) (18 children)

Exactly. Same with faux bait stuff. I personally think it's gross, so I don't consume it, but if everyone involved is a consenting adult and it stops people from consuming real CSAM, I can't really support banning it.

But the problem many people have with stuff like that is they assume the people consuming it will go on to do it to real people, which is the same argument they tried to use against violent video games.
