zlacker

[return to "Moderation is different from censorship"]
1. Silver+hP 2022-11-03 10:51:09
>>feross+(OP)
I think something that really bothers me about this discussion of moderation is how many people approach the debate like a newborn baby: they have an idea and then speculate about how it fixes everything. There's never any discussion of what already exists in the real world. ACX here is essentially describing some key attributes of reddit. Each subreddit has its own moderation team that decides what's acceptable, and then you opt in. This is pretty close to what ACX is proposing.

So let's look at what happened in reality. Almost immediately, subreddits popped up that were at the very least attempting to skirt the law, and often directly breaching it; popular topics on reddit included creative interpretations of the age of consent, for example, or indeed of the requirement for consent at all. And because anyone can create one of these communities, the site turns into a game of whack-a-mole.

The second thing that happened was that communities popped up pretty much for the sole purpose of harassing other communities. By enabling this sort of marketplace of moderation, you are providing a mechanism for a group of people to organize attacks on your own platform. So now you have to step back in, and we're back to censorship.

I also think that this article completely mischaracterizes what the free speech side of the debate wants.

2. fallin+iZ 2022-11-03 12:15:50
>>Silver+hP
Moderation isn't supposed to prevent illegal content; law enforcement is. So that's out of scope. Moderation is supposed to prevent harassment, but the harassment problem you describe is just a failure of reddit to provide the correct tools to block organized harassment, not a failure of the concept.
3. PaulHo+Ou1 2022-11-03 14:47:14
>>fallin+iZ
Do you want cops filtering through all online sites looking for child porn? Do you think that's a good use of their time?

It's the threat of law enforcement that leads people who run websites to remove illegal content.

Generally (to, say, please advertisers) there is an expectation that sites will be proactive about removing offensive (or illegal) material. Simply responding on a whack-a-mole basis is not good enough. I ran a site where something like 1 in 10,000 images was offensive (not illegal... but images of dead nazis, people with terrible tumors on their genitals, etc.) and that was not clean enough for AdSense. From the viewpoint of quality control, particularly the Deming viewpoint of statistical quality control, it is an absolute bear of a problem to find offensive images at that rate -- and consider how many people write papers where an A.I. program that gets 70% accuracy counts as state of the art.
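To make the arithmetic concrete, here's a minimal back-of-the-envelope sketch (in Python) of why that base rate is so brutal. The numbers are hypothetical, and I'm assuming "70% accuracy" means 70% sensitivity and 70% specificity:

    # Toy illustration of the base-rate problem: a classifier with
    # 70% sensitivity and 70% specificity, applied to a stream where
    # 1 in 10,000 images is offensive. All numbers are hypothetical.
    base_rate = 1 / 10_000    # prevalence of offensive images
    sensitivity = 0.70        # P(flagged | offensive)
    specificity = 0.70        # P(not flagged | clean)

    # Bayes' rule: P(offensive | flagged)
    p_flagged = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)
    precision = sensitivity * base_rate / p_flagged
    print(f"flagged images that are actually offensive: {precision:.4%}")
    # ~0.023%: roughly 4,300 false positives for every true hit, so a
    # human still has to review a mountain of flags per bad image,
    # and 30% of the genuinely bad images slip through anyway.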
