zlacker

[return to "Google Web Environment Integrity Is the New Microsoft Trusted Computing"]
1. Knee_P+lp[view] [source] 2023-07-27 06:31:08
>>neelc+(OP)
There is a freedom problem, there is a hardware problem and there is a social problem.

The freedom problem is this: you will not be able to roll your own keys.

This is probably the biggest nail in the coffin for a ton of computers out there. In theory you could simulate via software the workings of a TPM. If you built a kernel module the browser would have no real way of knowing if it sent requests to a piece of hardware or a piece of software. But the fact that you would have to use Microsoft's or Apple's keys makes this completely impossible.
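The key-ownership point can be made concrete. A minimal sketch (not a real attestation protocol; the root-key names are made up): the verifier only accepts quotes whose signing key chains to a vendor-held root, so a software TPM that speaks the protocol perfectly but signs with its own key is still rejected.

```python
# Illustrative sketch: vendor-root allowlisting is what blocks a software TPM.
# A real verifier would walk an X.509 certificate chain and check the quote
# signature and nonce; here the chain is reduced to a single "root" field.

VENDOR_ROOTS = {"microsoft-root-key", "apple-root-key"}  # hypothetical IDs

def verify_attestation(quote: dict) -> bool:
    # Accept only if the signing key chains to an approved vendor root
    # AND the freshness nonce checks out.
    return quote["root"] in VENDOR_ROOTS and quote["nonce_ok"]

hardware_quote = {"root": "microsoft-root-key", "nonce_ok": True}
software_tpm_quote = {"root": "my-own-key", "nonce_ok": True}  # correct protocol...

assert verify_attestation(hardware_quote) is True
assert verify_attestation(software_tpm_quote) is False  # ...but wrong root
```

Note that nothing in the protocol itself distinguishes hardware from software; the rejection comes purely from the allowlist of roots, which is exactly the part users cannot control.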

The hardware problem is this: you will not be able to use older or niche/independent hardware.

Since we established that software simulation is impossible, this makes a ton of older devices utter e-waste for the near future. Most Chromebooks don't even have a TPM, so even though they are guaranteed updates for 10 years, how are they going to browse the web? (Maybe in that case Google could actually deploy a software TPM with their keys, since it's closed source.) I have a few old business laptops at home with TPM 1.x. In theory it performs just as well as TPM 2.x, but they will not be supported because, again, I will not be able to use my own keys.

Lastly there is the social problem: is DRM the future of the web?

Maybe this trusted computing stuff really is what the web is bound to become, either using your certified TPM keys or maybe your Electronic National ID card or maybe both in order to attest the genuineness of the device that is making the requests. Maybe the Wild West era of the web was a silly dream fueled by novelty and inexperience and in the future we will look back and clearly see we needed more guarantees regarding web browsing, just like we need a central authority to guarantee and regulate SSL certificates or domain names.

◧◩
2. raxxor+GM[view] [source] 2023-07-27 09:46:02
>>Knee_P+lp
The wild west internet performed perfectly well. There are some problems here and there that could be improved, but none of them are addressed by suggestions like this. This is about control and market reach, nothing else. Secure Boot was as well. The evil maid problem is at least believable in a corporate context. These suggestions are just fluffy crap.
◧◩◪
3. kahncl+Xg1[view] [source] 2023-07-27 13:21:25
>>raxxor+GM
Really? Spam, scams, SEO trash, bots, and AIs are utterly rampant.

I don’t want Google and Microsoft to have the keys to the kingdom, but on the other hand, I really want a way to know that I’m having genuine interactions with real people.

I wish government was getting more involved here.

◧◩◪◨
4. vetina+oy1[view] [source] 2023-07-27 14:28:55
>>kahncl+Xg1
It won't solve any of these problems.

But you will have to use hardware and software from approved vendors.

◧◩◪◨⬒
5. mike_h+252[view] [source] 2023-07-27 16:34:50
>>vetina+oy1
It can (that's why it's being pursued) and that, ironically enough, could even empower decentralized and P2P networks. Hear me out.

If you look at the history of the internet it's basically a story of decentralized protocols with a choice of clients being outcompeted by centralized services with a single client, usually because centralized services can control spam better (+have incentives to innovate etc, it's not just one issue).

Examples: USENET -> phpBB -> reddit, IRC -> Slack, ISP hosted email -> Gmail -> Facebook Messenger, SMS -> WhatsApp/iMessage, self-hosted git -> GitHub.

The reason spam kills decentralized systems is that all the techniques for fighting it are totally ad-hoc security-through-obscurity tricks combined with large dollops of expensive Big Data and ML processing, all handled by full-time teams. It's stuff that's totally out of reach for indie server hosts. Even for the big guys it frequently fails!

Decentralized networks suffer other problems beyond spam due to their reliance on peers being trusted. They're fully open to attack at all times, making it risky and high effort to run nodes. They're open to obscure app-specific DoS attacks. They are riddled with Sybil attacks. They leak private data like sieves. Many features can't be implemented at all. Given all these problems, most users just give up and either outsource hosting or switch to entirely centralized services.

I used to work on the Gmail spam team, and also Bitcoin, so I have direct experience of the problems in both contexts.

Remote attestation (RA) isn't by itself enough to fix these problems, but it's a tool that can solve some of them. Consider that if USENET operators had the ability to reliably identify clients, then USENET would probably have lasted a fair bit longer. Servers wouldn't have needed to make block/allow decisions themselves, they could have simply propagated app identity through the messages. Then you could have killfiled programs as well as people. If SpamBot2000 shows up and starts flooding groups, one command is all it takes to wipe out the spam. Where it gets trickier is if someone releases an NNTP client that has legit users but which can be turned into a spambot, like via scripting features. At that point users would have to make the call themselves, or the client devs would need to find a way to limit how much damage a scripted client can do. So the decision on what is or is not "approved" would be in the hands of the users themselves, in that design.
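The "killfile programs as well as people" idea above can be sketched in a few lines, assuming each message arrives with an attested client identity (all names here are invented for illustration):

```python
# Sketch of server-side filtering when messages carry a verified client
# identity. One killfile entry wipes out all traffic from a spambot client,
# independent of how many sender accounts it uses.

KILLFILED_CLIENTS = {"SpamBot2000"}          # blocked programs
KILLFILED_SENDERS = {"spammer@example.com"}  # blocked people

def accept(msg: dict) -> bool:
    # Drop the message if either the attested client or the sender is killfiled.
    if msg["client_id"] in KILLFILED_CLIENTS:
        return False
    if msg["sender"] in KILLFILED_SENDERS:
        return False
    return True

msgs = [
    {"sender": "alice@example.com", "client_id": "tin-1.9"},
    {"sender": "bob@example.com", "client_id": "SpamBot2000"},
]
assert [accept(m) for m in msgs] == [True, False]
```

The hard part, as the comment notes, is not this filter but deciding who maintains the killfiles when a client has both legitimate users and abusable scripting features.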

The above may sound weird, but it's a technique that allows P2P networks with client choice to be competitive against centralised alternatives. And it's worth remembering that for all the talk of the open web and maybe the EU can do this or that, Facebook just did the most successful social network launch in history as a mobile/tablet-only app that blocks the EU. A really good reason not to offer a web version is that mobile-only services are much easier to defend against spam, again, because mobiles can do RA and browsers cannot. So the web is already losing in this space due to lack of these tools. Denying the web this sort of tech may seem like a short-term win, but it just means that stuff won't be served to browsers at all, and P2P apps that want to be accessible from desktops won't be able to use it either.

Anyway it's all very theoretical, because at this time Windows doesn't have a workable app-level RA implementation, so it's mobile-only for now anyway (Linux can do it between servers in theory, but not really on the desktop).

◧◩◪◨⬒⬓
6. vetina+BV2[view] [source] 2023-07-27 20:08:02
>>mike_h+252
> It can (that's why it's being pursued)

No, it can't -- see below. There's also no quantitative objective stated or communicated, so there is no way to check whether it achieves the stated objective or not. What would happen if it doesn't achieve it? Nothing, because it was never promised clearly enough -- only in some vague way.

But it does happen to achieve a different goal -- for example, concentrating control over general computing into even fewer hands.

Would it be rolled back if it doesn't achieve the stated goal? Of course not; it will achieve the hidden ("it just happened, who could ever know, pinky swear") goal, and that's what matters -- not the pretend-goals that were used to sell it to the general public.

Now, why it won't achieve the stated goals: because spam is a problem in closed systems too. Ever got a junk call? Phone users use only "approved" devices, and even if the system can put limits on the source, it also limits how the destination can protect itself. The important thing about spam, scams, etc. is that wherever there is money to be made, the scammers will find a way -- even with a low-tech approach (like hiring a bunch of human operators for the approved machines). They weren't stopped even when what they did was illegal, so why would RA achieve what the law didn't? To make things worse, the closed nature makes it harder for the victims to preserve evidence of the spam or scam.

So of course it won't reduce the scams, but it will make the situation worse for us all. And the web losing to proprietary platforms? It will certainly lose once it is turned into one of them.

[go to top]