It’s not secret, because anyone who emails the mod team will be provided an answer.
It’s not free as in open source, because it isn’t available for anyone to download and study in full.
So, since it’s not secret, is it public, or private? Since it’s not published in full but any query of LIMIT 1 is answered, is that open, closed, or other?
Restrictions on publication don’t necessarily equate to secrecy, but the best I’ve got is “available upon request”, which isn’t quite right either. Suggestions welcome.
The opposite would be to show the author of the content some indicator that it's been removed, and I would call that transparent or disclosed moderation.
Interestingly, your comment first appeared to me as "* * *" with no author [2]. I wonder if that is some kind of ban.
[1] https://www.youtube.com/watch?v=8e6BIkKBZpg
[2] https://i.imgur.com/oGnXc6W.png
Edit: I know you commented again, but it's got that "* * *" thing again.
"This domain is not allowed on HN" as an error message upon submission.
Re the 'delay' setting, see https://news.ycombinator.com/newsfaq.html.
If you're going to censor someone, you owe it to them to be honest about what you're doing to them.
(Even when doing the RightThing(TM) would probably be easier...)
And, BTW, I occasionally get blocked by the mechanisms here even though I'm not doing anything bad, but I understand that there is a trade-off.
Unless HN is suddenly the government, what you've mislabeled is moderation, not censorship. Calling it censorship just exaggerates your opinion and makes you look unhinged. It's a private website, not national news.
I really like this take on moderation:
"The essential truth of every social network is that the product is content moderation, and everyone hates the people who decide how content moderation works. Content moderation is what Twitter makes — it is the thing that defines the user experience."
From Nilay Patel in https://www.theverge.com/2022/10/28/23428132/elon-musk-twitt...
The parts that don't work especially well, most particularly discussion of difficult-but-important topics (in my view), have also been acknowledged by its creator pg (Paul Graham) and its mods (publicly, dang, though there are a few others).
In general: if you submit a story and it doesn't go well, drop a note to the moderators: hn@ycombinator.com. They typically reply within a few hours, perhaps a day or two if things are busy or the issue is complex.
You can verify that a submission did or didn't go through by checking on the link from an unauthenticated (logged-out) session.
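For example, here's a minimal sketch of that check in Python (the item id is a placeholder, and the exact "[dead]" / "No such item." markers are assumptions to verify against an actual logged-out view):

    # Sketch: does an HN item render for a logged-out visitor?
    import urllib.request

    def visible_logged_out(item_id: int) -> bool:
        url = f"https://news.ycombinator.com/item?id={item_id}"
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req) as resp:
            page = resp.read().decode("utf-8", errors="replace")
        # Dead items are marked "[dead]" (or absent entirely) when viewed
        # without logging in; live items render normally.
        return "[dead]" not in page and "No such item." not in page

    print(visible_logged_out(12345678))  # placeholder id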
Moderation is the removal of content that objectively doesn’t belong in context, e.g. spam.
Obviously that moderation definition is nuanced, because some could argue that Marxist ideas don’t belong in the context of a site with a foundation in startups. And indeed, Marxist ideas often get flagged here.
I suppose a sufficiently motivated spammer might incorporate that as a submission workflow check.
> Moderation is the normal business activity of ensuring that your customers like using your product. If a customer doesn’t want to receive harassing messages, or to be exposed to disinformation, then a business can provide them the service of a harassment-and-disinformation-free platform.
> Censorship is the abnormal activity of ensuring that people in power approve of the information on your platform, regardless of what your customers want. If the sender wants to send a message and the receiver wants to receive it, but some third party bans the exchange of information, that’s censorship.
Censorship is somewhat subjective: something that you might find offensive and want moderated might not be considered so by others. Alexander therefore argues that the simplest mechanism that turns censorship into moderation is a switch that, when enabled, lets you see the banned content, which is exactly what HN does. He further argues that there are kinds of censorship that aren't necessarily bad by this definition: disallowing pedophiles from sharing child porn with each other is censorship, but it's something that we should still do.
[1] https://astralcodexten.substack.com/p/moderation-is-differen...
Operators of public sites should NOT have to pay that tax. So you at best are not fully aware of the actual cost, IMHO.
Congrats to HN for striking a reasonable pragmatic balance.
*I had some of the first live (non-academic) Internet connectivity in the UK, and the very very first packets were hacking attempts...
Blame the trolls that prevent us from having nice things.
> You can verify that a submission did or didn't go through by checking on the link from an unauthenticated (logged-out) session.
Trustful users do not think to do this, and it would not be necessary if the system did not keep the mod action secret.
Those who have been advised to do so, through the Guidelines, FAQ, comments, or moderator notes, do, to their advantage.
(I'd had a submission shadowbanned as it came from the notoriously flameworthy site LinkedIn a month or few back. I noticed this, emailed the mods, and got that post un-banned. Just to note that the process is in place, and does work.)
I've done this on multiple occasions, e.g.: <https://news.ycombinator.com/item?id=36191005>
As I commented above, HN operates through indirect and oblique means. Ultimately it is a social site managed through culture. And the way that this culture is expressed and communicated is largely through various communications --- the site FAQ and guidelines, dang's very, very, very many moderation comments. Searching for his comments with "please" is a good way to find those, though you can simply browse his comment history:
- "please" by dang: <https://hn.algolia.com/?dateRange=pastYear&page=0&prefix=tru...>
- dang's comment history: <https://news.ycombinator.com/threads?id=dang>
Yes, it means that people's feelings get hurt. I started off here (a dozen years ago) feeling somewhat the outsider. I've come to understand and appreciate the site. It's maintained both operation and quality for some sixteen years, which is an amazing run. If you go back through history, say, a decade ago, quality and topicality of both posts and discussions are remarkably stable: <https://news.ycombinator.com/front?day=2013-08-14>.
If you do have further concerns, raise them with dang via email: <hn@ycombinator.com>. He does respond, and he's quite patient; it might take a day or two for a more complex issue, but it will happen.
And yes, it's slow, inefficient, and lossy. But, again as the site's history shows, it mostly just works, and changing that would be a glaring case of Chesterton's Fence: <https://hn.algolia.com/?q=chesterton%27s+fence>.
But that's selective education. You don't do it for every shadow moderated comment. The trend is still that shadow moderation more often disadvantages trustful users. Will you acknowledge that harm?
Over 50% of Reddit users have a removed comment in their recent history that they likely were not told about. When shadow moderation is in play, abuse runs rampant among both mods and users. Both find more and more reasons to distrust each other.
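For anyone who wants to check their own history, here's a rough sketch in the spirit of tools like Reveddit, built on Reddit's public JSON listings (no login needed; the username is a placeholder, and treat the "[removed]" marker as an assumption to verify):

    # Sketch: find comments in a user's public history that the anonymous
    # thread view shows as "[removed]" or omits entirely.
    import urllib.request, json

    HEADERS = {"User-Agent": "shadow-check-sketch/0.1"}

    def fetch_json(url):
        req = urllib.request.Request(url, headers=HEADERS)
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    def find_comment(node, comment_id):
        # Recursively search a Reddit listing for a comment by id.
        if isinstance(node, dict):
            data = node.get("data", {})
            if node.get("kind") == "t1" and data.get("id") == comment_id:
                return data
            for value in node.values():
                hit = find_comment(value, comment_id)
                if hit:
                    return hit
        elif isinstance(node, list):
            for value in node:
                hit = find_comment(value, comment_id)
                if hit:
                    return hit
        return None

    def possibly_removed(username):
        url = f"https://www.reddit.com/user/{username}/comments.json?limit=25"
        for child in fetch_json(url)["data"]["children"]:
            c = child["data"]
            thread = fetch_json(f"https://www.reddit.com{c['permalink']}.json")
            public = find_comment(thread, c["id"])
            # The author's own history shows the original body either way.
            if public is None or public.get("body") == "[removed]":
                yield c["permalink"]

    for link in possibly_removed("some_user"):  # placeholder username
        print("possibly removed:", link)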
It's quite possible the reason the list isn't public is because it would give away information about what thought is allowed and what thought isn't.
How do you think spammers and abusers will exploit those options?
Again: HN works in general, and the historical record strongly confirms this, especially as compared with alternative platforms, Reddit included, which seems to be suffering its own failure modes presently.
A forum should not do things that elbow out trustful people.
That means, don't lie to authors about their actioned content. Forums should show authors the same view that moderators get. If a post has been removed, de-amplified, or otherwise altered in the view for other users, then the forum should indicate that to the post's author.
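As a sketch of what that might look like in code (all names here are hypothetical, not any real forum's API), the idea is simply that the author's view carries the same status flags a moderator's view does:

    # Sketch of transparent-to-the-author moderation (hypothetical names).
    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        body: str
        removed: bool = False      # taken down by a moderator
        deamplified: bool = False  # down-weighted in ranking

    def render_for(post: Post, viewer: str, is_mod: bool = False):
        # Authors and moderators see identical status flags; everyone
        # else gets the filtered view, with removed posts hidden.
        if viewer == post.author or is_mod:
            flags = []
            if post.removed:
                flags.append("[removed]")
            if post.deamplified:
                flags.append("[reduced visibility]")
            return " ".join(flags + [post.body])
        return None if post.removed else post.body

    p = Post(author="alice", body="hello", removed=True)
    print(render_for(p, viewer="alice"))  # "[removed] hello"
    print(render_for(p, viewer="bob"))    # None: hidden from others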
> How do you think spammers and abusers will exploit those options?
Spammers already get around and exploit all of Reddit's secretive measures. Mods regularly post to r/ModSupport about how users have circumvented bans. Now they're asking forums to require ID [1].
Once shadow moderation exists on a forum, spammers can then create their own popular groups that remove truthful content.
Forums that implement shadow moderation are not belling cats. They sharpen cats' claws.
The fact that some spammers overcome some countermeasures in no way demonstrates that:
- All spammers overcome all countermeasures.
- Spam wouldn't be far worse without those countermeasures.[1]
- Removing such blocks and practices would improve overall site quality.
I've long experience online (going on 40 years), I've designed content moderation systems, served in ops roles on multi-million-member social networks, and done analysis of several extant networks (Google+, Ello, and Hacker News, amongst them), as well as observed what happens, and does and doesn't work, across many others.
Your quest may be well-intentioned, but it's exceedingly poorly conceived.
________________________________
Notes:
1. This is the eternal conflict of preventive measures and demonstrating efficacy. Proving that adverse circumstances would have occurred in the absence of prophylactic action is of necessity proving a counterfactual. Absent some testing regime (and even then) there's little evidence to provide. The fire that didn't happen, the deaths that didn't occur, the thefts that weren't realised, etc. HN could publish information on total submissions and automated rejections. There's the inherent problem as well of classifying submitters. Even long-lived accounts get banned (search: <https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...>). Content moderation isn't a comic-book superhero saga where orientation of the good guys and bad guys is obvious. (Great comment on this: <https://news.ycombinator.com/item?id=26619006>.)
Real life is complicated. People are shades of grey, not black or white. They change over time: "Die a hero or live long enough to become a villain." Credentials get co-opted. And for most accounts, courtesy of long-tail distributions, data are exceedingly thin: about half of all HN front-page stories come from accounts with only one submission in the Front Page archive, based on my own analysis of same. They may have a broader submission history, yes, but the same distribution applies there: many, and almost always most, submissions come from people with painfully thin histories on which to judge them. And that's assuming that the tools for doing said judging are developed.
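For what it's worth, that kind of distribution is checkable against the public Algolia HN API. A sketch (the front_page tag is real, but the sample depth here is arbitrary, and Algolia caps pagination, so deeper sampling would need created_at_i numericFilters):

    # Sketch: what share of recent front-page stories come from accounts
    # with only one front-page appearance in the sample?
    import urllib.request, json
    from collections import Counter

    def front_page_authors(pages: int = 3):
        authors = []
        for page in range(pages):
            url = ("https://hn.algolia.com/api/v1/search_by_date"
                   f"?tags=front_page&hitsPerPage=1000&page={page}")
            with urllib.request.urlopen(url) as resp:
                hits = json.load(resp)["hits"]
            if not hits:
                break
            authors += [h["author"] for h in hits]
        return authors

    counts = Counter(front_page_authors())
    stories = sum(counts.values())
    singles = sum(1 for n in counts.values() if n == 1)
    print(f"{singles / stories:.0%} of {stories} sampled front-page "
          "stories come from single-appearance accounts")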
You asked me for an alternative and I gave one.
You yourself have expressed concern over HN silently re-weighting topics [1].
You don't see transparent moderation as a solution to that?
> The fact that some spammers overcome some countermeasures in no way demonstrates that...
Once a spammer knows the system he can create infinite amounts of content. When a forum keeps mod actions secret, that benefits a handful of people.
We already established that secrecy elbows out trustful people, right? Or, do you dispute that? I've answered many of your questions. Please answer this one of mine.
> That removing such blocks and practices would improve overall site quality.
To clarify my own shade of grey, I do not support shadow moderation. I support transparent-to-the-author content moderation. I also support the legal right for forums to implement shadow moderation.
[1] <https://news.ycombinator.com/item?id=36435312>