[–] Gaywallet@beehaw.org 5 points 2 years ago* (last edited 2 years ago) (1 children)

You know, I've had an idea fermenting for some time now around how content moderation at scale might work. I have no idea whether it's feasible, nor do I have the technical expertise to bring it to fruition, but I think the following points suggest that content moderation at scale is possible:

  • The 90-9-1 rule doesn't just apply to lurking versus commenting on websites; it applies to many facets of participation: who creates videos, who volunteers to moderate, really all aspects of user interaction
  • People like to feel included and useful in communities, and they contribute in ways that work for them. For some, this is money; for some, it's the creation of art; some socialize, some connect, some offer goods and services, some trade, etc.
  • Moderating content doesn't have to be so centralized. The final call on moderation doesn't even necessarily have to revolve around a single individual - it can be a crowd-sourced decision (it already often is: groups of moderators having conversations on more nuanced or important issues).
  • In-person content moderation, or communities policing behavior amongst themselves, often consists of a lot of talking and a spread-out reaction to an incident or incidents. Being a bad person in a small town might carry many minor negative social consequences

When all of this combines, it makes you wonder whether content moderation couldn't be accomplished more akin to how a small town deals with a problematic individual: lots of small interactions with that person, with some people helping, others chastising, some educating, and their actions being more closely watched.

How does this translate to a digital environment? That's the part I'm still trying to figure out. Perhaps problematic comments could be flagged by other users, as in existing systems, but then fall into a queue where regular community members vote on how appropriate they were, with some kind of credit system (perhaps weighted by how much each voter contributes to, or receives positive feedback in, that particular community) determining the outcome. As it is, many of the conversational parts of this community feedback already happen: people argue with or push back against problematic users, and also educate or attempt to help them. A system might even link problematic individuals up with self-flagged educators who can talk with them directly to help them learn and grow. Honestly, I don't know all the specifics, but I think it's interesting to think about.
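
To make that voting piece concrete, here's a minimal sketch of what such a credit-weighted review queue could look like. Every name, parameter, and threshold below is hypothetical, just to illustrate the shape of the idea:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a credit-weighted moderation queue.
# All names and thresholds are made up for illustration; this is
# not how Beehaw (or any real fediverse instance) actually works.

@dataclass
class Vote:
    user_id: str
    appropriate: bool  # True = "leave it up", False = "take it down"

@dataclass
class FlaggedComment:
    comment_id: str
    votes: list[Vote] = field(default_factory=list)

class ModerationQueue:
    def __init__(self, credit_scores: dict[str, float], quorum: int = 5):
        # credit_scores: per-user weights, e.g. derived from how much a
        # user contributes to or receives positive feedback in this community
        self.credit_scores = credit_scores
        self.quorum = quorum
        self.flagged: dict[str, FlaggedComment] = {}

    def flag(self, comment_id: str) -> None:
        # Any user flagging a comment drops it into the review queue.
        self.flagged.setdefault(comment_id, FlaggedComment(comment_id))

    def vote(self, comment_id: str, user_id: str, appropriate: bool) -> None:
        self.flagged[comment_id].votes.append(Vote(user_id, appropriate))

    def resolve(self, comment_id: str) -> str | None:
        """Return 'keep' or 'remove' once quorum is reached, else None."""
        item = self.flagged[comment_id]
        if len(item.votes) < self.quorum:
            return None
        keep = remove = 0.0
        for v in item.votes:
            # Votes count in proportion to the voter's standing here.
            weight = self.credit_scores.get(v.user_id, 1.0)
            if v.appropriate:
                keep += weight
            else:
                remove += weight
        return "keep" if keep >= remove else "remove"
```

The credit scores would obviously have to come from real participation history, and you'd need answers for ties, appeals, and vote brigading, but the basic shape of it is pretty simple.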

[–] BrooklynMan@beehaw.org 4 points 2 years ago* (last edited 2 years ago) (1 children)

> Content moderation at scale is impossible to do well. Importantly, this is not an argument that we should throw up our hands and do nothing. Nor is it an argument that companies can’t do better jobs within their own content moderation efforts. But I do think there’s a huge problem in that many people — including many politicians and journalists — seem to expect that these companies not only can, but should, strive for a level of content moderation that is simply impossible to reach.

anyone who has tried to moderate an online community of any scale can relate to this, especially when looking at the downfall of traditional social media sites like facebook and twitter.

[–] alyaza@beehaw.org 5 points 2 years ago* (last edited 2 years ago)

> anyone who has tried to moderate an online community of any scale can relate to this, especially when looking at the downfall of traditional social media sites like facebook and twitter.

in the early days here, moderating was kind of a nightmare like on those sites, because we'd semi-frequently get spammed by neo-nazis and trolls who posted all sorts of heinous stuff, plus a ton of ordinary spam (it's part of why we don't have open registrations). just so much work for dubious payoff; luckily it's worked out in the end here

[–] ozoned@beehaw.org 3 points 2 years ago

Fedi FTW! :-D \o/

Soon they'll realise this applies to governments too.
