If websites adopt this, it will effectively make building a web search engine impossible for new entrants. The incumbents can whitelist/attest their own crawlers while categorizing every other scraping client as a bot.
Even setting aside other objections, I can't see how Google, a search company, can be allowed to push something that kills competition by leveraging its market dominance in other areas like browsers.
https://github.com/RupertBenWiser/Web-Environment-Integrity/...
> However, a holdback also has significant drawbacks. In our use cases and capabilities survey, we have identified a number of critical use cases for deterministic platform integrity attestation. These use cases currently rely on client fingerprinting. A deterministic but limited-entropy attestation would obviate the need for invasive fingerprinting here, and has the potential to usher in more privacy-positive practices in the long-term.
I think any holdback will eventually go away because of those "critical use cases for deterministic platform integrity attestation": a holdback makes the signal non-deterministic, which is exactly what those use cases are said to need.
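For context, a holdback means the browser deliberately withholds the attestation token for some fraction of requests even when it could produce one, so a site that hard-requires the token also locks out real users. A minimal sketch of the idea, with all names and the rate purely hypothetical (not from the spec):

```ts
// Hypothetical holdback sketch: refuse to return an attestation token
// for a fixed fraction of otherwise-eligible requests.
const HOLDBACK_RATE = 0.1; // assumed 10% for illustration only

interface AttestationToken {
  payload: string;        // signed verdict from the attester
  lowEntropyBits: number; // limited-entropy deterministic signal
}

function maybeAttest(clientEligible: boolean): AttestationToken | null {
  // Holdback: sometimes return nothing even for an eligible client,
  // so sites cannot treat a missing token as proof of a bot.
  if (Math.random() < HOLDBACK_RATE) return null;
  if (!clientEligible) return null;
  return { payload: "signed-by-attester", lowEntropyBits: 3 };
}
```

The quoted passage pulls in the opposite direction: the "deterministic" use cases only work if the token is always present for legitimate clients, which is why I expect the holdback to be dropped.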