Technology
This is the official technology community of Lemmy.ml for all news related to the creation and use of technology, and to facilitate civil, meaningful discussion around it.
Ask in a DM before posting product reviews or ads; such posts are otherwise subject to removal.
Rules:
1: All Lemmy rules apply
2: Do not post low-effort posts
3: NEVER post naziped*gore stuff
4: Always post article URLs or their archived version URLs as sources, NOT screenshots. Help the blind users.
5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies' actions affecting a wide range of people)
6: no advertisement posts unless verified as legitimate and non-exploitative/non-consumerist
7: Crypto-related posts, unless essential, are disallowed
https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf
I'd suggest that anyone who cares about the issue take the time to read the actual report, not just drama-oriented news articles about it.
So if I'm understanding right, based on their recommendations this will all be addressed by the moderation and QOL tools introduced as development moves further down the roadmap?
If I can try to summarize the main findings:
Problem #2 can hopefully be improved with better tooling. I don't know what you do about Problem #1, though.
One option would be to decide that the underlying point of removing real CSAM is to avoid victimizing real children; and that computer-generated images are no more relevant to this goal than Harry/Draco slash fiction is.
And are you able to offer any evidence to reassure us that simulated child pornography doesn't increase the risk to real children as pedophiles become normalised to the content and escalate (you know, like what already routinely happens with regular pornography)?
Or are we just supposed to sacrifice children to your gut feeling?
Would you extend the same evidence-free argument to fictional stories, e.g. the Harry/Draco slash fiction that I mentioned?
For what it's worth, your comment has already caused ten murders. I don't have to offer evidence, just as you don't. I don't know where those murders happened, or who was murdered, but it was clearly the result of your comment. Why are you such a terrible person as to post something that causes murder?
I have no problem saying that writing stories about two children having gay sex is pretty fucked in the head, along with anyone who forms a community around sharing and creating it.
But it's also not inherently abuse, nor is it indistinguishable from reality.
You're advocating that people just be cool with photo-realistic images of children, of any age, being raped, by any number of people, in any possible way, with no assurances that the images are genuinely "fake" or that pedophiles won't be driven to make it a reality, despite other pedophiles cheering them on.
I was a teenage contrarian pseudo-intellectual once upon a time too, but I never sold out other people's children for something to jerk off to.
If you want us to believe it's harmless, prove it.
You keep making up weird, defamatory accusations. Please stop. This isn't acceptable behavior here.
Awful pearl-clutchy for someone advocating for increased community support for photorealistic images of children being raped.
Which do you think is more acceptable to Lemmy in general? Someone saying "fuck", or communities dedicated to photorealistic images of children being raped?
Maybe I'm not the one who should be changing their behavior.
FYI: You've now escalated to making knowingly-false accusations about a specific person.
Quality play. That'll absolutely convince people that you're the good guy if they somehow find themselves at the bottom of this thread without having read your wet-mouthed defense of the sharing of photorealistic images of children being raped.
You've got a lot of dark, lewd fantasies about other people. Please don't post them here. I'm sure there's an NSFW community where you can post them instead; but you do not have my consent to post your dark, lewd fantasies about me and what sort of person you fantasize that I am. You've already crossed over that line.
Please stop.
I'm asking you specifically: stop inventing dark, lewd fantasies about me and posting them here.
Only dark, lewd fantasies about the sexual assault of their kids, right?
That seems to be your thing, but it's still off-topic here, and writing me as a character into your weird fantasies is still nonconsensual.
FYI: You've now escalated to making knowingly-false accusations about a specific person.
On the contrary, I accurately reported my impression and my refusal of consent for you to post any more fiction about me.
Please stop.
Don't worry, it's AI generated.
Such a signal exists in the ActivityPub protocol, so I wonder why it's not being used.
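For context, the signal being referred to is presumably the `sensitive` flag that Mastodon and compatible servers attach to ActivityPub objects (an extension to the ActivityStreams vocabulary, also used for content warnings), which a receiving instance can check before rendering or federating attached media. A minimal sketch in Python, assuming an incoming activity already parsed into a dict; the helper name `is_sensitive` and the sample object are illustrative, not part of any server's actual codebase:

```python
# Sketch: checking Mastodon's "sensitive" flag on an incoming
# ActivityPub object before deciding how to handle its media.
# The JSON shape follows Mastodon's conventions; not every
# fediverse server is guaranteed to emit this property.

def is_sensitive(activity: dict) -> bool:
    """Return True if the wrapped object is marked sensitive,
    defaulting to False when the flag is absent or the object
    is only a URI reference rather than an inlined dict."""
    obj = activity.get("object", activity)
    if not isinstance(obj, dict):
        return False
    return bool(obj.get("sensitive", False))

# Example Create activity roughly in Mastodon's output shape.
note = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "object": {
        "type": "Note",
        "content": "<p>content warning applies</p>",
        "sensitive": True,
        "attachment": [
            {"type": "Document", "url": "https://example.com/img.png"}
        ],
    },
}

print(is_sensitive(note))  # prints True
```

A receiving instance could use a check like this to hide media behind a click-through, or to refuse to cache attachments from untrusted servers; the catch the report implies is that the flag is only useful when the *sending* instance sets it honestly.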
Well the simple answer is that it doesn't have to be illegal to remove it.
The legal question is a lot harder, considering AI image generation has reached levels that are almost indistinguishable from reality.
Stopped reading.
Child abuse laws "exclude anime" for the same reason animal cruelty laws "exclude lettuce." Drawings are not children.
Drawings are not real.
Half the goddamn point of saying CSAM instead of CP is to make clear that Bart Simpson doesn't count. Bart Simpson is not real. It is fundamentally impossible to violate Bart Simpson's rights, because he doesn't fucking exist. There is nothing to protect him from. He cannot be harmed. He is imaginary.
This cannot be a controversial statement. Anyone who can't distinguish fiction from real life has brain problems.
You can't rape someone in MS Paint. Songs about murder don't leave a body. If you write about robbing Fort Knox, the gold is still there. We're not about to arrest Mads Mikkelsen for eating people. It did not happen. It was not real.
If you still want to get mad at people for jerking off to the wrong fantasies, that is an entirely different problem from photographs of child rape.
You should keep reading then, because they cover that later.
What does that even mean?
There's nothing to "cover." They're talking about illustrations of bad things, alongside actual photographic evidence of actual bad things actually happening. Nothing can excuse that.
No shit they are also discussing actual CSAM alongside... drawings. That is the problem. That's what they did wrong.
Oh, wait, "Japanese" in the other comment, now I get it. This conversation is about AI loli porn.
Pfft, of course, that's why no one is saying the words they mean, because it suddenly becomes much harder to take the stance since hatred towards Loli Porn is not universal.
Okay, thanks for the clarification
Everyone except you still very much includes drawn and AI-generated pornographic depictions of children within the basket of problematic content that should get filtered out of federated instances, so thank you very much, but I'm not sure your point changed anything.
If you don't think images of actual child abuse, against actual children, are infinitely worse than some ink on paper, I don't care about your opinion of anything.
You can be against both. Don't ever pretend they're the same.
Step up the reading comprehension please
I understand what you're saying and I'm calling you a liar.
You mean to say I'm wrong or you actually mean liar?
'Everyone but you agrees with me!' Bullshit.
'Nobody wants this stuff that whole servers exist for.' Self-defeating bullshit.
'You just don't understand.' Not an argument.
Okay, the former then.
Let's just think about it, how do you think it would turn out if you went outside and asked anyone about pornographic drawings of children? How long until you find someone who thinks like you outside your internet bubble?
"Nobody wants this stuff that whole servers..."
There are also servers dedicated to real child porn with real children too. Do you think that argument has any value with that tidbit of information tacked onto it?
Ask a stranger about anything pornographic and see how it goes.
This is rapidly going from pointless to stupid. Suffice it to say: stop pretending drawings are ever as bad as actual child abuse.
Oh, it will go very differently if the porn doesn't involve depictions of children.
I don't care what you think. Stop equating drawings with rape.
Step up the reading comprehension please
Insult-dribbling troll repeats failed deflection.
Doesn't work this time either.
Just stop lying about this topic. It's kind of important.
Repeating over and over again that I'm equating drawings with rape isn't going to cut it if people can just read what I wrote, especially when nobody was even talking about rape in the first place.
Child sexual abuse is rape.
Thank you for demonstrating you have no idea what your words mean.
Hey, people can read what you write. Stop making stuff up.
The irony of you repeatedly sneering "reading comprehension" is delightful.
Yes, idiot, people can see my perfectly consistent point. Good job. I once told you to stop lumping together drawings with CSAM, and now, I'm still telling you stop lumping together drawings with CSAM.
CSAM is a product of rape. Child rape is part of the subject matter. That is the "child abuse" that's in the goddamn headline.
Except the dolts at Stanford lumped in drawings.
And you aggressively do not see the problem with treating those things the same way.
Even though one is a drawing, and the other is rape. You are... equating... those two things. You are treating them identically and interchangeably.
And you need to stop.
"treating them the same" => The threshold for being banned is already crossed at the lolicon level.
From the perspective of the park, pissing in a pond and fighting a dude both get you thrown out. That doesn't mean you're "treating them the same". You're just the park.
Do you get it now?
You've absolutely treated them the same.
You see no problem with this study explicitly about CSAM casually lumping in... drawings.
The only reason you said shit to me in the first place was to smugly assert that they belong together. You declared confidently that the entire world was on your side, which does raise questions of who exactly is hosting and using these servers full of drawings you don't like.
But the issue is the false equivalence.
The only issue is the false equivalence.
Sounding the alarm about child rape shouldn't fucking include drawings for the same reason reports of a murder epidemic at the local park shouldn't throw in "and also someone pissed in the pond." Bit of a difference there! Kind of important! Negative one and negative one billion are both negative, but they're plainly not as negative, are they? They don't belong in the same sentence without a very narrow context. Glibly chastising someone, for pointing out that gulf, is the polar opposite of helpful, to anyone.
Saying 'so what, they're both bad' is false equivalence. It's dangerous nonsense that enables far-reaching abuse of power. What you're doing is the textbook basis for unjustifiable surveillance laws, censorship, and general moral policing. If your kneejerk reaction when someone belabors the difference between "we have to stop this proliferation of child rape!" and "we have to stop this proliferation of drawings we don't like!" is to insist you don't like either and you'd expect them to be treated the same - you are the problem.
I mean, not as much of a problem as child rape, but nonetheless, shut up already.
The study is transparent about their definition of CSAM. At this point, if you don't get it, you don't get it. Sorry dude.
"They're up-front about the false equivalence," says chronic clue-dodger.
You're really mad that a US based study is using the US definition of CSAM while also clearly stating the definition of CSAM they're using, aren't you?
Sure buddy, it's a "false equivalence", they're totally stating it's the same. It's not just your reading comprehension.
I don't believe you're physically capable of self-consistently stating what you think your point is. You're somehow not equating these things you just said are defined as the same thing. You're sneering as if noticing that definition - which you block-quoted - is some kind of delusion.
All while projecting emotion in a way that's almost as ironic as your constant parroting of the phrase "reading comprehension."
Is it difficult to use the internet without object permanence?
Step up the reading comprehension please :)
Dull.
Oh no, what you describe is definitely illegal here in Canada. CSAM includes depictions here. Child sex dolls are illegal. And it should be that way because that stuff is disgusting.
Literally impossible.
Child rape cannot include drawings. You can't sexually assault a fictional character. Not "you mustn't." You can't.
If you think the problem with child rape amounts to 'ew, gross,' fuck you. Your moral scale is broken, if there's not a vast gulf between those two bad things.