zlacker

[parent] [thread] 2 comments
1. snowwr+(OP)[view] [source] 2023-07-02 04:37:13
Sites that wish to be ubiquitous must handle scraping in a way that is transparent to valuable users. It’s not like scraping is a new or complex threat to availability. This is table stakes for large services in 2023.

Twitter was very good at this, and their new-found inability is a glaring sign that their engineering is slipping.

replies(1): >>shon+j81
2. shon+j81[view] [source] 2023-07-02 15:51:50
>>snowwr+(OP)
Really? I disagree. Running a ubiquitous service that is good for users does not require that you allow every random person to scrape your site and incur that cost.

You simply set up API deals with those you want to have your data, the ones that benefit your business, i.e. Google etc…

Then you close everything else up. This saves cost and complexity, and real users, the targets of your advertisers, don’t even notice.

This isn’t a sign that engineering is slipping.

It’s a sign that, in a company which struggles to make money, someone is paying attention and trying new things to fix the money problem.

replies(1): >>snowwr+ix3
3. snowwr+ix3[view] [source] [discussion] 2023-07-03 12:18:26
>>shon+j81
> Running a ubiquitous service that is good for users does not require that you allow every random person to scrape your site and incur that cost.

What I said is that they must handle the problem in a way that is transparent to their valuable users. That includes (and usually requires) targeted techniques to block high-volume scraping.
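A targeted technique of the sort described above can be sketched as a per-client token bucket: generous limits for known users, tight limits for anonymous traffic. This is a minimal illustration, not any real service's implementation; the rates, capacities, and tier names are assumptions.

```python
import time
from dataclasses import dataclass, field


@dataclass
class TokenBucket:
    """Illustrative per-client limiter: refills `rate` tokens/sec, bursts up to `capacity`."""
    rate: float
    capacity: float
    tokens: float = 0.0
    last: float = field(default_factory=time.monotonic)

    def __post_init__(self) -> None:
        # Start full so a new client can burst immediately.
        self.tokens = self.capacity

    def allow(self, cost: float = 1.0) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False


# Hypothetical tiers: logged-in users get room to browse normally,
# while anonymous bulk scrapers exhaust their budget quickly.
buckets = {
    "user": TokenBucket(rate=10.0, capacity=100.0),
    "anon": TokenBucket(rate=0.5, capacity=5.0),
}
```

The point of the tiering is exactly the transparency argument: an ordinary user never hits the anonymous bucket's ceiling, so the throttle is invisible to them while still capping high-volume scrapers.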
