True. The profit motive is. People pushing harmful content are doing it because it makes them money, not because they're twirling their moustaches as they relish their evil deeds. You remove the profit motive, you remove the motivation to harm people for profit.
The algorithms boost content according to 1) what people engage with, and 2) what companies assess to be appealing. Facebook took the lead in having the social media platform own the engagement algorithms, but the companies and people pushing the content can and do also run their own algorithmic targeting. Just as Joe Camel existed before social media and still got to kids (and not just on TV), harmful actors will find and join Discord servers. All Facebook and Twitter did was handle the targeting for them; the targeting would still exist without the platforms' assistance.
It wasn't as bad on those... back when we were teens. It absolutely is now. If anything, you'll usually find that a lot of the most harmful groups (red-pill/manosphere influencers, body-image influencers, especially those built around inducing EDs) actually operate their own Discord servers that they steer or capture kids into. They make contact elsewhere, then pull them into a more insular space where they can be more extreme and forceful in pushing their products, out of public view.
I'm not saying it's all individuals; I'm saying the opposite: it's companies. Just not social media companies. Social media companies are the convenient access vector for the companies actually selling and pushing the harmful products, and the corollary ideas that drive kids to them.
Given that your immediate solution was to regulate kids instead of regulating companies, I don't think you're going to be interested in my solutions.