Publications like NYT, WSJ, the Economist, and the New Yorker have paywalls that leave ways for readers to work around them. Such stories are OK to post to Hacker News. Yes, it sucks, but losing that many substantive articles would suck worse. In the future, when someone doesn't understand this, please direct them to this thread or to HN's FAQ [1], which now makes this explicit.
Complaints about paywalls are off topic, so please don't post them. The spirit of HN is to discuss specific articles and avoid generic rehashing. Arguments about The Paywall Question are all the same. For an example of what we want to avoid, see [2]. For more on our thinking, see [3].
It's ok to ask how to read an article or to help other users by sharing a workaround. But please do this without going on about paywalls. Focus on the content.
1. https://news.ycombinator.com/newsfaq.html
2. https://news.ycombinator.com/item?id=10178012
3. https://hn.algolia.com/?query=by:dang%20paywall&sort=byDate&...
The announcement mills (phys.org comes to mind, but there are plenty of others, including nature.com itself) are not really "original" sources; the papers are. Yet such announcement-advertisement articles are submitted regularly.
Finding the freely available pre-print and/or author provided copies without resorting to (ahem) other workarounds is a pain but useful.
I would call pasting-the-URL-into-Google-Search less of an intentional workaround and more of a trick that takes advantage of the websites' compliance with Google's rules.
Not every HN reader would know to do that, or look in the comments for that "workaround."
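For anyone who hasn't run into it before, the trick amounts to searching Google for the article and clicking through from the results page, since some publishers relax the paywall for visitors who arrive from a search engine. A minimal sketch of building that search URL (whether a given site still honors the referral is entirely up to the site; the WSJ URL below is a made-up example):

    from urllib.parse import quote_plus

    def google_search_url(article_url: str) -> str:
        # Build a Google search query for the article's own URL; clicking
        # through from the results page is what some paywalled sites treat
        # as a "came from search" visit.
        return "https://www.google.com/search?q=" + quote_plus(article_url)

    # Hypothetical example URL, for illustration only.
    print(google_search_url("https://www.wsj.com/articles/example-article"))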
That's right, so it's ok for people to ask and share how to read an article in the comments. There shouldn't need to be more than one or two comments about this, and it helps everyone focus on the content.
What's off-topic is the generic tangent of paywall complaining.
The best way to keep these "off-topic" comments off HN is to make the sacrifice and stop rewarding those sites with traffic.
Since almost all paywalled articles are from the WSJ, the Economist, or the NYT, this shouldn't happen to you very often.
The goal of this move isn't "stop people from talking about paywalls at all times". The goal here is to increase the average quality of discussion that takes place when a paywalled article is submitted. Different things.
"Workarounds" which are already incorporated into the link by the time I see them are fine. Workarounds that require me to download seven apps and swing a chicken over my head while ROT13'ing the URL are... not.
It is the site's job (HN's job) to provide usable links to its readers. That is literally your only job, the only thing you are here for. If it fails, it fails, and it should be criticized for that.
If you're posting a story which began with a press release, it's better to find the original press release (probably on PR Newswire) and link to that. At least you can read the hype before it was munged by some minimum-wage Demand Media employee.
Having links on the site fail arbitrarily devalues the entire page. Users end up stopping and thinking "hey, is clicking this link going to waste my time?", which results in the entire system being perceived as less reliable and less trustworthy.
I agree that such discussions are off-topic, but is there a better way to handle these articles than "RTFM, noob"?
Of course the paywalls suck. Is there any user who has to deal with more of these annoyances than we ourselves do? There can't be many.
The question is the lesser of two evils. Anyone who doesn't get what a disaster it would be for HN to lose the NYT, WSJ, Economist, and New Yorker doesn't get HN in the first place.
The links are just huge wastes of time. A prominent tag attached to the article would be ok, but in the absence of any other feature to avoid these time sinks, it makes sense to flag the articles to save others from additional wastage.
What will you do when every article on the homepage is paywalled?
What will you do when users provide free mirrors, either pasted in the comments section or hosted elsewhere?
Will you be providing easy-to-use guides for users (new or otherwise) on how to effectively utilize such workarounds?
I'd like to add my voice to the calls for some kind of flair indicating that a submitted link leads to paywalled content, so that I may avoid such links.
I'd much rather read a primary source than read a blog that summarizes a splog that links to another blog that wrote about a headline that appeared in the NY Times.
I pay for NYTimes, WSJ, ACM Digital Library, etc. And most often the best information is from these sites.
But while we're at it, how about a [cache] link? More often than not we end up overloading the submitted site and there's always a comment with Google Cache link.
The objections seem to be largely ideological, and reciting ideology is the essence of uninteresting in HN's sense. If someone comes up with something new and clever to say about paywalls, by all means post it as a story and let the community have at it. Repeating complaints for the zillionth time, not so much.
Edit: I understand it could be controversial for HN to officially push links that pierce paywalls. But a tag letting users avoid wasting time would be appropriate, as would a link to a discussion of common workarounds.
Sometimes people post these and others respond with links to freely available versions, or articles about the work. In such cases we're happy to update the URLs.
We're not happy about announcement mills either (and those sites are penalized on HN), but that's arguably a separate problem.
If it wasn't, you wouldn't be implementing a policy to stop people from talking about it.
I think any talk about the responsibilities of an ad-free site to its readers, much less a statement that its job is to behave in the way that one of its readers prefers, is probably presumptuous at best.
No, the site's job is to aggregate links and provide a platform for discussion. Providing usable links is the responsibility of the people posting the links - other readers. Links which don't generate quality discussion will be killed any number of ways, but the quality of user-generated content is entirely on the users generating it.
In this case, the value of being able to discuss certain kinds of content with minor workarounds is greater than not having that content at all.
"Paywalls with workarounds" means people can read them. Obviously we care about that—we've explicitly let everyone know that users are welcome to help each other do so.
Re value, people disagree about value judgments but someone has to make the call, and it's the same now as it has always been.
I can at least tell you what it's based on: HN wants to maximize the quality of the articles on the front page and the quality of the comments in the threads. Sites like the NYT and the New Yorker increase the former. Repetitive complaining about paywalls reduces the latter. Hence the above.
Why should someone else spend all the effort to code up a solution to your problem when you believe it is beneath you to click on a link and find out for yourself?
EDIT: One solution would be to use a link to a web cache, a screenshot, the Wayback Machine, or similar.
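As a sketch of the Wayback Machine part of that suggestion, here is roughly how you could look up an archived copy via archive.org's availability endpoint. The endpoint and response shape are as I remember them, so treat this as an assumption rather than a guarantee:

    import json
    import urllib.parse
    import urllib.request

    def wayback_snapshot(url):
        # Ask the Wayback Machine's availability API for the closest snapshot.
        # Returns the snapshot URL if one exists, otherwise None.
        api = ("https://archive.org/wayback/available?url="
               + urllib.parse.quote(url, safe=""))
        with urllib.request.urlopen(api) as resp:
            data = json.load(resp)
        closest = data.get("archived_snapshots", {}).get("closest")
        if closest and closest.get("available"):
            return closest["url"]
        return None

    print(wayback_snapshot("https://www.newyorker.com/"))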
The news sites have made the economic calculation that allowing access to traffic from content aggregators like Google (which is the price of being discoverable by Google) is worthwhile.
The idea that only sufficiently large aggregators/traffic sources should get a special pass seems preposterous; anyone trying to enforce that would be engaging in downright anticompetitive behavior.
The cat is already dead, can we please open the box & acknowledge the source of the foul smell?
I was thinking more of Elsevier, which actively sues websites that publish mirrors of papers (see https://torrentfreak.com/elsevier-cracks-down-on-pirated-sci...).
It was a polite feature request, asked civilly.
Thank you for your contributions to this site.
B: Even Google gets on the geo-IP bus and will outright 404 pages in some zones with zero indication that the URL was ever OK. Recent example was the Google Solar Roof thing - in some countries, it was just a 404, nothing else.
We'll all do whatever the overall population of the web ends up doing, should this come to pass. Which will be some combination of: subscribe, subscribe to an aggregated subscription (like cable TV), or don't read it.
I'll keep RefControl in mind next time it happens, thanks. I'm not sure why setting the header this way would work when clicking directly on the link or from the Google search page doesn't, but heck, I have no model for this behavior anymore.
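For what it's worth, extensions like RefControl just set the Referer header on the outgoing request; whether that unlocks the article is decided entirely by the site's server, which is why the behavior feels so arbitrary. A bare-bones sketch of the same idea, assuming a site that treats search-engine referrals specially:

    import urllib.request

    def fetch_with_referer(url, referer="https://www.google.com/"):
        # Request the page while claiming to arrive from a search engine.
        # Some paywalled sites have historically served full articles to
        # such visitors; many do not, and the behavior changes over time.
        req = urllib.request.Request(url, headers={
            "Referer": referer,
            "User-Agent": "Mozilla/5.0",  # some servers reject the default Python UA
        })
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode("utf-8", errors="replace")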
- Just don't read the article.
- Subscribe. If you can't/won't afford it, then see above, or see below.
- Search for other sources of the information. And post them, it adds to the discussion. Most articles worth taking up space, particularly on paywalled sites, are worth that space in other venues. Almost nothing is exclusive, not after a day anyway.
In the WSJ case, I've noticed that yahoo often prints the article verbatim.
1. People complaining about paywalls
2. People complaining about poor quality content
I'd wager that most HN users are using AdBlock as well. How do you reconcile this with the above complaints? I'm sure some users restrict AdBlock on certain sites, but I suspect it's far from the majority.
A significant portion of the HN community are specifically building websites intended to make money. Perhaps the majority in the past, before the Elves left the forest.
What's special about news sites, that compels people to complain about them popping up on HN? If it's really a bad thing, then shouldn't we be complaining about non-news sites that make money (or are trying to)?
Isn't every YC company trying to make money, and charging for what their website offers?
Sheesh.
I don't get it? No, you don't get it.
So maybe it's OK to require a proxy from China? But what if a user in North Korea can't access a proxy? Should that link also be prohibited on HN? How is HN supposed to know what consists of an acceptable workaround, and what does not? Are the standards different for different countries?
It seems simpler to assume that the articles that get upvotes have people who want to read them, and that if people want to read the article everyone is raving about, they will find a way. It's Hacker News after all.
This means everyone can read it.
Also, I'll be creating a scraper that analyzes users' comment histories to determine, when they complain about paywalls, whether they've ever complained about the state of journalism or scientific funding. If they have, I will link to the evidence so they can be duly down-voted, ridiculed, and shamed.
For two, declaring rules and then declaring that no-one is allowed to talk about said rules sets a very dangerous precedent.
For three, pretty darn ironic that both this and http://deathtobullshit.com/ are on the front page at the same time.
So anyway, what I complain about are sites that load the paywall at first visit. If the site allows five free articles per month, why not just wait until the sixth?
This problem will get a lot easier when we have a way to group the related URLs for a story. That's something we are eventually going to work on. Indeed, I wonder if it couldn't turn into a broader solution to the paywall question.
This statement is up for debate. I don't get why you continue to declare it like it is settled. Paywalls do limit access to content, no matter how easy or numerous the workarounds. In my opinion, the site suffers when paywalled links are posted. I don't buy the just-so story that NYT/Economist/WSJ links are so important to HN that we simply must suffer their existence.
Declaring discussion of paywalls thoughtcrime is not good for the community. Suggesting people who can flag should not flag because they should figure out which of the 18 different workarounds they can use to read content is also not appropriate.
> Just so it's clear: this is a sure way to lose your flagging privileges on HN
Wow. So flagging an article that you can't read (by design), is a way to lose the ability to flag. What exactly are you supposed to flag, then? If you can't flag "this page just asks me for a credit card", then what exactly can you flag?
It's also easily derived from the values of this site, which are no secret (see https://news.ycombinator.com/item?id=10179248) and have not changed.
It's good to be the king.
I still don't see any ad hominem, and as a statement of HN's very specific values it seems obvious to me, but you're right that I shouldn't have said it in a dismissive way.
Because it directly affects this community. It's a link aggregator site for heaven's sake.
> If you run into a link and discover it's to pay-walled content simply move on, you've only wasted like 3 seconds of your life.
I could take this argument ad infinitum. Why didn't you just move on instead of posting a comment here? Why does anyone say anything critical ever instead of just moving on?
> Also, I'll be creating a scraper that analyzes users' comment histories to determine, when they complain about paywalls, whether they've ever complained about the state of journalism or scientific funding. If they have, I will link to the evidence so they can be duly down-voted, ridiculed, and shamed.
Great, vigilante justice and public ridicule over a topic you supposedly don't care one whit about.
Because they run the site and, I imagine, are representing the broader group who ultimately make decisions about it?
It very much reminds me of an ostrich sticking its head in the sand - namely, that the response of a link aggregator to more and more of the links it aggregates disappearing is to say "no it's not, there are <insert increasingly complex and increasingly quasi-legal workarounds here>, now stop complaining".
It's not just the current state that should be worrying; it's the continuation of the trend. And a website as major as this within its domain is one of the few that has more than a snowball's chance in Hell of diverting said trend, assuming it acts in a timely fashion. But this is just waiting around like a lobster in a pot of water being slowly heated.
I've got a specific use case that's not applicable to most HN readers - here in .au Popular Science links are useless - they do a geo redirect based on your ip address, which redirects me to the homepage of the .com.au version of their site, which in general doesn't even have the original article available...
I'm occasionally tempted to think that HN should go in the same direction: no links whatsoever, everything is plain text. You want to read the article, you cut and paste. Or write your own browser extension, or whistle it into a cell phone or something. Terrible for rapid reading, but it would definitely cut down on the complaining about paywall tags. One-click links probably violate some Amazon patent anyway.
Taking it a step farther, all submissions must be done rot13. If you can't figure out how to translate a link to rot13 (or install an appropriate browser extension), maybe you shouldn't be posting here. Not because you are inherently unworthy, but because you haven't bothered to read and follow the instructions. The instructions could be given on the bottom of the guidelines page, and all improperly formatted submissions could redirect to the guidelines: https://news.ycombinator.com/newsguidelines.html
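(For the record, the ROT13 half of that scheme really is a one-liner; a throwaway sketch, purely to illustrate the joke:)

    import codecs

    def rot13_url(url):
        # "Encrypt" a URL per the tongue-in-cheek submission scheme above.
        return codecs.encode(url, "rot_13")

    print(rot13_url("https://news.ycombinator.com/newsguidelines.html"))
    # -> uggcf://arjf.lpbzovangbe.pbz/arjfthvqryvarf.ugzy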
I feel like it's either that, or more all-Erlang days:
> You can help the spike subside by making HN look extra boring. For the next couple days it would be better to have posts about the innards of Erlang than women who create sites to get hired by Twitter.
pg, https://news.ycombinator.com/item?id=512145
And while you are at it, get off my lawn. :)
> they should figure out which of the 18 different workarounds they can use
Workarounds are a nuisance but this exaggerates it. Overwhelmingly these articles come from a small number of sites that have the same few workarounds. Most people have internalized them long ago (or installed software to do so), and for anyone who hasn't, it's fine to share info like "open an incognito window" or "google the article title" in the threads. What's not fine is to turn every thread into the same old argument about paywalls.
> what exactly can you flag
You should flag things that shouldn't be on HN in the first place. But a New Yorker article on, say, Nabokov and butterflies obviously should be on HN. (Obviously, that is, given the mandate and history of the site.) Articles on offbeat topics that lead outside HN's core grooves are the most endangered species here. We need more of those. Flagging them is an abuse of flagging. Sometimes people do that because of paywalls, even when the paywall has a trivial workaround like an incognito window. That's what I was referring to.
Intellectual diversity is the founding value of this site: https://news.ycombinator.com/hackernews.html. That's what I meant re "getting" HN. But I'll try to be more helpful than dismissive when communicating it.
Rather than banning the crappy discussion, why not ban the articles that result in it?
And many ads are paid per view, not per click.
Hit the "Raw" button on https://github.com/voltagex/hackernews-paywalltag/blob/maste... - linking directly is not a good user experience for anyone using GreaseMonkey
But let me take a crack at this. No, pasting the full text of an article directly into the thread is not a good workaround. First, it gums up the thread. Second, obvious copyright issues.
Therefore, if there's a standard workaround like "incognito window", "turn cookies off", or "google the article title", the way to help people is to teach them that. If none of those things work, linking to a different way to read the content (such as a Google cache link or an archive link) is probably ok. Beyond that my crystal ball gets cloudy.
Also, the reason why said discussions are starting to "choke other threads like weeds" is because it is rather hard to have a discussion about something when one has to resort to increasingly-complex and quasi-legal methods (if not downright illegal in some places) just to read the content people are trying to have a discussion about.
I don't really want to write RES for HackerNews, but it's an interesting project for the last week of my holidays.
Articles from sites that are accessible to a limited group of people have no place here. Instead, they should be discussed in the comments of the article itself.
Should we also stop the "what's your favourite book" threads, since the books are mostly paywalled, or ask if we are allowed to link to the torrents?
Obviously the discussions on such articles aren't all crappy. Often they're good. That doesn't mean that off-topic generic tangents about paywalls aren't a problem. All generic tangents are a problem, and this was an increasingly common one.
It astonishes me how the people making objections in this discussion ignore that we're talking about articles that are possible for nearly everyone to read. That's what paywalls with workarounds means. It means readable with a bit of a nuisance.
There have been a few legitimate counterpoints—for example, if it's true that in some countries you can't google WSJ articles to read them, that's a problem. But mostly this argument has charged ahead as if we were talking about unreadable content, with lots of indignant points being made on that basis and little stopping to notice that it's false.
edit: The auto-generated bypass instructions will get the top-sorted/top-comment favoritism that we normally try to avoid from users.
If sites don't want people to bypass paywalls, then they would not allow "special" ways to bypass paywalls. The fact that some paywalls have special referrer bypass rules reeks of financially motivated favoritism and entrenched interests preventing competition; the next search engine startup to be created is going to have a rough time of it.
As much as I hate to admit it, the sad state of suckage for announcement mills (including university press sites) actually does have some minor advantages; which would you be more inclined to read and up-vote?
"Astronomers detect furthest galaxy yet with Keck telescope"
or
"Lyman-Alpha Emission From A Luminous Z=8.68 Galaxy: Implications For Galaxies As Tracers Of Cosmic Reionization"
Non-astronomers would be lucky to understand the details presented in just the abstract of the paper, and I say this as a non-astronomer who does _NOT_ understand all of said details. Reading original source papers takes far more effort than reading lightweight announcements, and this gets to the fundamental question of, "What do we want HN to be?"
The status quo of interested HN users finding and comment-linking to the original source papers (if available) on the puff-piece stories is a lot of manual work and some stuff gets missed, but it really does tend to work out reasonably well. If we forbid paywalls without workarounds and require original sources, then we will miss out on a lot of great new research. Besides infringement, there is no easy answer for this situation.
Of the flood of links posted to /newest the paywalled links are nowhere near the most problematic.
It astonishes me how this policy is so favorable to a money-sucking strategy yet ignores the myriad other usability complaints that frequently pop up (e.g. why is/isn't this on Medium, wtf is this scroll-jacking, why is the JS so big, why is the font so small/big).
None of the workarounds cost anything—that's what "workaround" means. Your comment is a good example of what I was talking about: indignation blithely proceeding on a false premise without stopping to consider it. The fact is that these articles are freely accessible with a bit of work. Had you said "time-sucking", you'd have had a point.
> ignores the myriad of other usability complaints that frequently pop up
You're right that those are also off-topic and mostly of little value. But we can't come up with a complete set of rules to cover everything under all cases. Even if we could, the community would reject it, and even if they didn't, what a miserable way to live.
Nobody cares about sites trying to make money. What people complain about is when it means an HN reader has to directly pay to participate.
Yes, over time you learn things.
Indeed it would, which is why it's not happening. That's already clear from the title: "workarounds" means the only price is a bit of annoyance.
I'm not sure of the best way to monetize that, but a paywall on a single domain sort of misses how 99% of the audience actually uses the site.
Cookies and private windows only work because the sites have a free views counter. They could stop at zero if they wanted.
I can't see the HN mods advocating anything illegal.
If every article on the homepage was paywalled, it would be because the balance had tipped in the sites' favor and they no longer felt compelled to allow workarounds. Almost everyone you might think could charge for their service would be charging, with the result that most or all articles on HN's homepage would be paywalled.
In that future, people would commonly subscribe to news sites. I can imagine that subscription-aggregation services would come to be, something like cable, where you pay one low-ish price and get access to lots of sites without having to manage individual subscriptions.
And now that I think about that more, that could end up replacing what cable is now, and the giant broadband companies would either become the dumb pipes that they truly are, or become those subscription aggregators.