zlacker

[parent] [thread] 7 comments
1. matheu+(OP)[view] [source] 2023-05-31 23:22:05
Total non-issue. If it breaks, people will fix it. There are people out there maintaining immense ad filter lists and executable countermeasures against ad blocker detection. Someone somewhere will care enough to fix it.
replies(2): >>chromo+b5 >>nomel+L8
2. chromo+b5[view] [source] 2023-06-01 00:03:21
>>matheu+(OP)
Only a fraction of a client's users are programmers who will fix it when it breaks. That fraction, inverted, gives a rough threshold for the client's audience size below which fixes stop coming.
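
The arithmetic can be sketched like this; the 1-in-10,000 figure is a made-up assumption, purely for illustration:

```python
# Illustrative sketch of the threshold argument above; the
# 1-in-10,000 figure is an assumption, not data.
users_per_fixer = 10_000  # assume 1 in 10,000 users will fix the client

def expected_fixers(audience_size: int) -> float:
    """Expected number of users willing and able to fix the client."""
    return audience_size / users_per_fixer

# The inverted fraction is the rough audience threshold: below it,
# you expect fewer than one maintainer and fixes stop coming.
print(users_per_fixer)          # 10000 -- rough audience threshold
print(expected_fixers(50_000))  # 5.0  -- big audience: fixes keep coming
print(expected_fixers(500))     # 0.05 -- niche site: likely goes unfixed
```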
replies(1): >>matheu+o6
3. matheu+o6[view] [source] [discussion] 2023-06-01 00:14:49
>>chromo+b5
And yet these people somehow maintain immense amounts of ad blocking filters and code, including active counter measures which require reverse engineering web site javascripts. I gotta wonder what would happen if they started making custom clients for each website instead.
replies(1): >>chromo+x7
4. chromo+x7[view] [source] [discussion] 2023-06-01 00:26:42
>>matheu+o6
Ad blockers' audience is huge, far larger than any single site's, and their maintainers probably wouldn't care about most individual sites (to care, you have to be in the audience, and most sites have small audiences).
replies(1): >>matheu+ob
5. nomel+L8[view] [source] 2023-06-01 00:38:45
>>matheu+(OP)
> There are people out there maintaining immense ad filter lists and executable countermeasures against ad blocker detection.

This is not a useful comparison. A failure of an ad blocker means you see an ad while using the service. Big deal. A failure of a reverse-engineered, glorified web scraper means the app stops working, completely, for every user of the client at once, until someone fixes it.

Yes, it could be democratized, but most users wouldn't understand any of this and would just say "ugh, this app never works". It would be a user experience that reddit could make as terrible as they wanted.

replies(1): >>matheu+3c
6. matheu+ob[view] [source] [discussion] 2023-06-01 01:07:45
>>chromo+x7
Someone cared enough to defeat sites' annoying anti-adblock scripts. If they cared just a little more, they could replace the web developer's code with their own minimal version. Chances are the site doesn't actually need most of the code it ships anyway.

What I'm talking about already exists by the way. Stuff like nitter, teddit, youtube downloaders. I once wrote one for my school's shitty website.
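
A minimal custom client of that kind is often just fetch-and-parse. A toy sketch, parsing a hardcoded stand-in page (in practice you'd download it with urllib or requests; the tag and class names are hypothetical):

```python
# Toy sketch of a minimal alternative front-end: keep only the
# content you care about, discarding the site's own scripts and
# styling. The markup below stands in for a fetched page; the
# "post-title" class name is a hypothetical example.
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collect the text of every <h2 class="post-title"> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2" and ("class", "post-title") in attrs:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.titles.append(data.strip())

page = """
<html><body>
  <script>/* tracking, ads, anti-adblock... all ignored */</script>
  <h2 class="post-title">First announcement</h2>
  <h2 class="post-title">Second announcement</h2>
</body></html>
"""

parser = TitleExtractor()
parser.feed(page)
print(parser.titles)  # ['First announcement', 'Second announcement']
```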

7. matheu+3c[view] [source] [discussion] 2023-06-01 01:14:03
>>nomel+L8
It absolutely is a useful comparison. It's obvious that this software depends on unstable interfaces that will eventually break; I wasn't talking about that. I was talking about the sheer effort it takes to create such things, and such efforts absolutely exist today. Projects like nitter and teddit exist; teddit is on the HN frontpage right now, no doubt in reaction to this thread. There's probably one for HN too, I just haven't found HN hostile enough to search for it.

Honestly I don't really care about "most users". To me they're only relevant as entries in the anonymity set. As long as we have access to such powerful software, I'm happy. I'm not out to save everyone.

replies(1): >>nomel+0K2
8. nomel+0K2[view] [source] [discussion] 2023-06-01 19:43:55
>>matheu+3c
> I was talking about the sheer effort it takes to create such things

I understand what you're saying, but I think this is the key to my point:

> It would be a user experience that reddit could make as terrible as they wanted.

It's an unfair cat-and-mouse game. Yes, effort could be made to fix it each time, but if reddit chose, they could randomize page elements and force everyone into the "most users" group: the only app works for 5 minutes a day, and people get bored.
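
The countermeasure works because scrapers key on stable identifiers. A sketch of why randomizing them breaks a client until someone updates it (the class names are hypothetical):

```python
# Sketch of why randomized page elements break scrapers: a client
# that hard-codes a fixed class name stops matching once the server
# regenerates names per deploy. Names here are hypothetical.
import random
import string

def randomized_class(base: str) -> str:
    """Simulate a server emitting a fresh class name each deploy."""
    suffix = "".join(random.choices(string.ascii_lowercase, k=6))
    return f"{base}-{suffix}"

scraper_selector = "comment-body"           # hard-coded in the client
served_class = randomized_class("comment")  # e.g. "comment-xqzvbn"

# The client's selector no longer matches anything on the page.
print(scraper_selector == served_class)  # False -- broken until updated
```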
