zlacker

[parent] [thread] 8 comments
1. steve_+(OP)[view] [source] 2023-07-26 19:04:54
The existence of a configuration that limits attestation to a probabilistic phenomenon seems like a very thin foundation to stand on here. If it can be changed to require a 100% attestation rate in the future, I think it will be changed as soon as it is feasible to do so.

I haven't reviewed the proposal closely enough to see how they implemented that; if it was done in a cryptographic way that prevents changing to 100%, then that could work. But the fact remains that control of our browsing environment is diminishing under this proposal.

replies(1): >>haburk+U9
2. haburk+U9[view] [source] 2023-07-26 19:45:23
>>steve_+(OP)
It seems to me that “if it can be changed to a 100% attestation rate in the future, it will be done” is a slippery slope argument that assumes bad faith on the part of the proposal writer.

I think if it were changed to 100% then it would be problematic. It also seems the proposal writer would agree that some form of opt-out is required to make it viable, so as not to forbid unknown clients.

I think it's important to stay away from considering potential “what ifs” that completely defy the intent of the spec. As an example of why this isn't effective discourse: we could imagine a potential addition to the spec that explicitly blocks users from certain countries. That's not great, but it's also easy to understand why it's not worth debating that point (even though it does sound scary).

replies(3): >>pwnna+0o >>Bizarr+Vp >>danShu+sB
3. pwnna+0o[view] [source] [discussion] 2023-07-26 20:39:04
>>haburk+U9
I don't understand how probabilistic holdbacks can be effective if you can request the attestation token multiple times. If the holdback percentage is 10%, the probability of getting no attestation for 10 calls in a row would be 0.1^10 = 1e-10. This seems trivial to implement, and trivial to use to block users.

Granted, I don't fully understand how they intend to implement the holdback, but even if they cache the attestation results so that 10 calls in a row fail to attest, they can't cache them forever. Websites can combine traditional fingerprinting techniques/cookies with attestation to build pretty foolproof systems that refuse to serve users based on attestation results.
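To make the retry math concrete, here's a minimal simulation of a per-request holdback. The rate, retry count, and function names are all hypothetical; the spec does not describe holdbacks this way, which is exactly the point being made:

```python
import random

HOLDBACK_RATE = 0.10   # hypothetical: fraction of requests that withhold attestation
MAX_RETRIES = 10

def attestation_granted() -> bool:
    """Model a naive per-request holdback: each call independently fails 10% of the time."""
    return random.random() >= HOLDBACK_RATE

def client_looks_attested(retries: int = MAX_RETRIES) -> bool:
    """A retrying site sees a client as unattested only if *every* try is held back."""
    return any(attestation_granted() for _ in range(retries))

# Probability that a genuinely attested client still shows no token after 10 retries:
p_all_fail = HOLDBACK_RATE ** MAX_RETRIES
print(f"P(no token in {MAX_RETRIES} tries) = {p_all_fail:.0e}")  # 1e-10
```

So unless holdback decisions are made sticky somehow, a site that simply retries can distinguish "held back" from "unattested" with near certainty.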

replies(2): >>danShu+8B >>wzdd+uY2
4. Bizarr+Vp[view] [source] [discussion] 2023-07-26 20:48:03
>>haburk+U9
If you had said that 5 years ago, I would have believed you, but the latest trend of megacorporations and billionaires moving against the interests of their users and ignoring their complaints has changed my stance on that.

Google is a big dog and will not care about our yapping. If it's allowed to do as it will without consequence then it will do so, and there is nothing that individuals can do to stop it other than to cease using their products.

5. danShu+8B[view] [source] [discussion] 2023-07-26 21:40:08
>>pwnna+0o
This too. Maybe Google is willing to say something like "okay, for the duration of today, no WEI for you"; but unless they're doing something a lot more clever than the spec suggests, the "fallback" could very well be "retry the request until it succeeds and sends an attestation token."

Google would need to make holdbacks persistent enough that you couldn't retry and get a different result. Even if they do, there are other problems; but randomly failing requests is definitely not enough to guarantee that attestation stays optional. And I see no details in the spec suggesting that Google is planning to do something different.

replies(1): >>pwnna+y81
6. danShu+sB[view] [source] [discussion] 2023-07-26 21:41:43
>>haburk+U9
Sometimes slippery slopes are real.

Part of the job of web specification development is to determine what potential bad actors could do in the future. If a spec were proposed that easily allowed blocking users from certain countries, I would want that listed in the potential risks. Mitigations and technical requirements are introduced into specs all the time that exist only to stop a potential future attack.

7. pwnna+y81[view] [source] [discussion] 2023-07-27 01:09:43
>>danShu+8B
How would you even differentiate between retries? If you isolate it by domain, the website can redirect you 10 times, collecting an attestation token each time. They could perform statistical analysis with cookies. Websites could even force logged-in users to conform to a particular browser (banking apps already do this). It's difficult for me to understand how the authors could miss these implications. They even said that with holdbacks websites can still perform statistical analysis. Statistical analysis is not just a tool for aggregate data; it can be applied to a single client with enough other identifiers.
8. wzdd+uY2[view] [source] [discussion] 2023-07-27 14:58:42
>>pwnna+0o
The "explainer" does actually address this, by talking about "a small percentage of (client, site) pairs". In other words, a particular browser, going to a particular site, will always and forever either enable holdback or not.

So, no, you couldn't continually request attestation from the one site. Instead, you could create 20 separate top-sites and load them all in tiny iframes. :)
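One plausible way to read "a small percentage of (client, site) pairs" is a deterministic decision derived from the pair itself, which is sticky per site but independent across sites. The explainer doesn't specify a mechanism, so this is a hypothetical sketch (identifiers and the 5% rate are made up) that also shows why the many-top-sites trick works:

```python
import hashlib

HOLDBACK_RATE = 0.05  # hypothetical 5% holdback rate, as discussed in the thread

def held_back(client_id: str, site: str) -> bool:
    """Deterministic per-(client, site) holdback: hash the pair and compare to the rate.
    The same pair always yields the same answer; a different site is an independent flip."""
    digest = hashlib.sha256(f"{client_id}|{site}".encode()).digest()
    value = int.from_bytes(digest[:8], "big") / 2**64  # map digest to [0, 1)
    return value < HOLDBACK_RATE

client = "client-1234"  # hypothetical stable per-browser identifier
sites = [f"top-site-{i}.example" for i in range(20)]
votes = [held_back(client, s) for s in sites]
# P(all 20 sites held back for one client) = 0.05**20, vanishingly small --
# so a page embedding 20 such origins almost always sees at least one token.
print(f"held back on {sum(votes)} of {len(sites)} sites")
```

The stickiness stops single-site retries, but each extra cooperating origin is another independent 95% chance of getting a token.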

replies(1): >>danShu+cm4
9. danShu+cm4[view] [source] [discussion] 2023-07-27 20:46:15
>>wzdd+uY2
Even if requests to separate domains didn't work, a 5% user loss is likely something that many websites can afford to eat.

Remember that Firefox has at least a 3% market share, and Safari is somewhere in the neighborhood of 20%. If websites are willing to go Chrome-only in that environment, permanent holdbacks won't change anything for them.

Particularly not if the solution to those holdbacks is "reinstall your browser and the holdback will probably go away." Which... they'd need to be unless Chrome starts tracking users to figure out who should have what holdbacks :)

The only way that holdbacks matter is if they affect 100% of Chrome users -- i.e., every single one of your customers/readers will at some point fail to send you attestation for your website. And even then... telling them to refresh the page becomes a problem.

But if it's only a subset of users, then just banning 5% of users (especially from ad-supported platforms) seems perfectly feasible for a company, and would probably be a preferred solution for some of them.

----

User: "Hey, for some reason when I browse Reddit nothing loads."

Support: "Yeah, very rarely a new Chrome install will do that. If you create an account and sign in, and then you send us some verification documents like an ID so we know you're not a scammer, then you'll still be able to browse. Otherwise just reinstall Chrome."

User: "Is there anything else I can do?"

Support: "No, we have to protect our ad integrity. If reinstalling the browser doesn't help, contact Google about it."

----

> Instead, you could create 20 separate top-sites and load them all in tiny iframes. :)

This too :)
