derefr (OP) | 2023-07-26 19:48:48
> The current players can whitelist/attest their own clients while categorizing every other scraping client as a bot.

Can't they already do this by having scrapers send plain-old client certificates? Or even just a request header that contains an HMAC of the URL with a shared secret?
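
For concreteness, a minimal sketch of that second option (Python; the header name X-Internal-Scraper-Sig and the secret value are made up for illustration):

    import hmac
    import hashlib

    SHARED_SECRET = b"rotated-out-of-band"  # known only to first-party scrapers and the origin

    def sign_url(url: str) -> str:
        # Scraper side: HMAC the full request URL with the shared secret.
        return hmac.new(SHARED_SECRET, url.encode(), hashlib.sha256).hexdigest()

    def is_first_party(url: str, header_value: str) -> bool:
        # Origin side: recompute the HMAC and compare in constant time.
        return hmac.compare_digest(sign_url(url), header_value)

    # The scraper would send something like:
    #   GET /watch?v=abc123
    #   X-Internal-Scraper-Sig: <sign_url("https://example.com/watch?v=abc123")>
    url = "https://example.com/watch?v=abc123"
    assert is_first_party(url, sign_url(url))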

Actually, taking a step further back: why does anyone need to scrape their own properties? They can make up an arbitrary backchannel to access that data — just like the one Google uses to populate YouTube results into SERPs. No need to provide a usefully-scrapeable website at all.
