If websites implement this, it will effectively make building a web search engine impossible for new entrants. The current players can whitelist/attest their own clients while categorizing every other scraping client as a bot.
Even setting other reasons aside, I can't see how Google, a search company, can be allowed to push something that can kill competition by using its market dominance in other areas like browsers.
Can't they already do this by having their scrapers send plain old client certificates? Or even just a request header that contains an HMAC of the URL computed with a shared secret?
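Something like this minimal sketch, assuming a shared secret distributed out of band and a made-up header name (`X-Scraper-Auth`); it's just an illustration of the idea, not anyone's actual implementation:

```python
import hmac
import hashlib

SHARED_SECRET = b"rotate-me-regularly"  # hypothetical; known to both scraper and origin

def sign_url(url: str) -> str:
    """Scraper side: hex HMAC-SHA256 of the URL under the shared secret."""
    return hmac.new(SHARED_SECRET, url.encode("utf-8"), hashlib.sha256).hexdigest()

def is_trusted_scraper(url: str, header_value: str) -> bool:
    """Server side: constant-time check of the X-Scraper-Auth header."""
    return hmac.compare_digest(sign_url(url), header_value)

# Scraper would then send, e.g.:
#   GET /some/page
#   X-Scraper-Auth: <sign_url("https://example.com/some/page")>
```

The point being: if you control both ends, you don't need a browser-level attestation scheme to recognize your own clients.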
Actually, taking a step further back: why does anyone need to scrape their own properties? They can use an arbitrary backchannel to access that data, just like the one Google uses to populate YouTube results into SERPs. There's no need to expose a usefully scrapeable website at all.