zlacker

[return to "Web Environment Integrity Explainer"]
1. jchw+G5 2023-07-19 13:22:42
>>christ+(OP)
Absolute worst spec I've ever seen. Google needs to be loaded into a cannon and fired into the sun.

> How does this affect browser modifications and extensions?

> Web Environment Integrity attests the legitimacy of the underlying hardware and software stack, it does not restrict the indicated application’s functionality: E.g. if the browser allows extensions, the user may use extensions; if a browser is modified, the modified browser can still request Web Environment Integrity attestation.

Then what's the point? I can make a modified bot browser that commits ad fraud as long as I don't use a rooted Android phone?

I don't believe they're being honest about how this will be used. We need to legally regulate remote attestation.

> As new browsers are introduced, they would need to demonstrate to attesters (a relatively small group) that they pass the bar, but they wouldn't need to convince all the websites in the world.

It speaks for itself. Horrid.

2. hoover+Jn 2023-07-19 14:30:32
>>jchw+G5
Trusted computing is all about ensuring that your machine can be trusted to run payloads that you can't observe or tamper with. Sad! I see why the free software people call it treacherous computing
3. mike_h+MK 2023-07-19 15:51:57
>>hoover+Jn
TC is value neutral so the FSF slurs don't make sense. Consider what happens when the machine in question is a cloud VM. Then you can run workloads on a rented machine without the risk of the cloud vendor spying on or tampering with your server. Likewise if the machine gets hacked. These are highly desirable properties for many people. For example Signal uses TC so the mobile apps can verify the servers before doing contact list intersection, keeping the contacts private from the Signal operators.

Another use case is multiparty computation. Three people wish to compare some values without a risk that anyone will see the combined data. TC can do this with tractable compute overhead, unlike purely cryptographic techniques.
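The three-party comparison above can be sketched like this. Everything here is illustrative: the function simulates what would run inside a remotely attested enclave, and no real TEE API is used — in a real deployment the parties would first verify the enclave's attestation, then submit inputs over encrypted channels.

```python
# Simulated "enclave" for the three-party comparison described above.
# Hypothetical sketch: in a real TC setup this code would run inside an
# attested TEE, so no party (or operator) could inspect the inputs.

def compare_in_enclave(values: dict[str, int]) -> str:
    """Sees all private inputs, but releases only the winner's name."""
    return max(values, key=values.get)

# Each party submits its private value; only the result leaves the
# enclave, so nobody learns anyone else's number.
winner = compare_in_enclave({"alice": 40, "bob": 75, "carol": 62})
print(winner)  # -> bob
```

The point is the interface: the combined data exists only inside the attested code, and the output is a single bit (or name), not the inputs.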

Observe what this means for P2P applications. A major difficulty in building them is that peers can't trust each other, so you have to rely on complex and unintuitive algorithms (e.g. blockchains) or duplication of work (e.g. SETI@Home) or benign dictators (e.g. Tor) to try to stop cheating. With TC peers can attest to each other and form a network with known behavior, meaning devs can add features rather than spend all their time designing around complicated attacks.

These uses require that you have a computer you do trust, one which can audit the remote server before uploading data to it. But you can compile and/or run that verification program on your laptop or smartphone; the process is easy.
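The audit step described above can be sketched as follows. This is a deliberately simplified illustration, not a real attestation protocol: all names are hypothetical, and a genuine quote (SGX, TPM, etc.) would be signed by the hardware and verified against vendor root keys rather than compared directly.

```python
# Hypothetical sketch of a client auditing a server before uploading:
# the client recomputes the expected "measurement" (a hash of the code
# the server claims to run, e.g. from a reproducible build) and compares
# it to the measurement reported in the server's attestation quote.
import hashlib

def client_audits_server(quote_measurement: str, local_binary: bytes) -> bool:
    expected = hashlib.sha256(local_binary).hexdigest()
    return quote_measurement == expected

server_binary = b"reproducibly-built server code"
# In reality the quote is produced and signed inside the server's TEE;
# here we just fabricate the measurement for illustration.
quote = hashlib.sha256(server_binary).hexdigest()
print(client_audits_server(quote, server_binary))  # -> True
```

If the server is running different code (hacked, or the vendor swapped it), the measurement won't match and the client refuses to upload.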

But exactly because TC is general it doesn't distinguish based on who owns the machine. It doesn't see your PC as morally superior to a remote server; they're all just computers. So yes, in theory a remote server could demand you run something locally and then do a hardware remote attestation for it. In practice though this never happens anymore outside of games consoles (as far as I'm aware), because most consumer devices don't have the right hardware for it, and even if they did you can't do much hardware interaction inside attested code.

4. funcDr+ib4 2023-07-20 15:06:39
>>mike_h+MK
> TC is value neutral so the FSF slurs don't make sense

The term "trusted computing" is often used in a way that suggests the user can trust something (their own computer). But it is the other way around: a service provider can trust a machine that the user bought not to do what the user wants. The name is misleading at best.

5. mike_h+vJ6 2023-07-21 08:24:57
>>funcDr+ib4
There are really only two ways TC is used in practice in today's economy:

1. Clients verifying cloud VMs. The "user" in this case is not the same person as "his" in "his computer".

2. Games consoles being verified by PS/Xbox online services. In this case the users are in effect verifying each other, because part of why they need such tough security is to stop online gaming being wrecked by cheaters.

At a stretch you could talk about credit card chips and the ATM network as (3) but that's far enough away from general computing that it doesn't really count.

In both cases these are firmly pro-consumer uses. Unless you want to pirate or cheat in games, of course, but there are plenty of users who don't want to subsidize your fun with their own suffering, so they're happy to rely on TC to stop that. It's a big part of why console gaming dominates PC gaming. It's wrong for the FSF to imply all those people are brainwashed dupes just because they have a different value system from Richard Stallman's.

6. funcDr+qd7 2023-07-21 13:01:20
>>mike_h+vJ6
> There are really only two ways TC is used in practice in today's economy:

Isn't that exactly because the idea of using TC in every PC produced a huge backlash?
