zlacker

[parent] [thread] 85 comments
1. fleven+(OP)[view] [source] 2022-07-29 23:59:09
Unpopular opinion:

Hardware-based attestation of the running software is an important security feature, especially in a world where data leaks and identity theft are rampant. Let's say I'm a healthcare provider, and I'm about to send sensitive medical data to a third party vendor. Wouldn't you prefer that this data only be able to be decrypted by a computer that can prove to the world it booted a clean OS image with all the latest security patches installed?

If the vendor wants to install some self-built OS that they trust on their computer and not update it for 5 years, that's their business, but I may not want to trust their computer to have access to my personal data.

Remote attestation gives more control to the owners of data to dictate how that data is processed on third-party machines (or even their own machines that may have been compromised). This is useful for more than just DRM.

replies(18): >>gjsman+z >>gerdes+h1 >>SCHiM+j1 >>userbi+C1 >>nouser+c3 >>csomar+s3 >>pdonis+16 >>grishk+C6 >>thrown+U6 >>matheu+W6 >>nitwit+57 >>Schroe+W9 >>stefan+nb >>XorNot+0c >>novok+Oc >>rubatu+tf >>thwart+Wf >>chii+7i
2. gjsman+z[view] [source] 2022-07-30 00:06:39
>>fleven+(OP)
I actually don't disagree with you. As I mention in the article:

> I cannot say how much freedom it will take. Arguably, some of the new features will be “good.” Massively reduced cheating in online multiplayer games is something many gamers could appreciate (unless they cheat). Being able to potentially play 4K Blu-ray Discs on your PC again would be convenient.

However, I'm more worried about the questions the increased deployment of this technology will bring, such as whether Linux users will be doomed to a CAPTCHA onslaught for being on untrusted devices, or worse. These are important questions that, unless raised, risk us just "going with the flow" until it is way too late.

replies(4): >>fleven+q1 >>matheu+i9 >>kmeist+9j >>richar+ak
3. gerdes+h1[view] [source] 2022-07-30 00:13:42
>>fleven+(OP)
"Hardware-based attestation of the running software is an important security feature"

I understand the mechanics in a "lies to children" way but who exactly is attesting what? Let's face it: MS isn't going to compensate me for a perceived flaw in ... why am I even finishing this sentence?

I recently bought some TPM 2.0 boards for my work VMware hosts so I could switch on secure boot and "attestation" for the OS. They are R630s which have a TPM 1.2 built in but a 2.0 jobbie costs about £16.

I've ticked a box or three on a sheet but I'm not too sure I have significantly enhanced the security of my VMware cluster.

replies(2): >>nouser+Q3 >>fleven+q4
4. SCHiM+j1[view] [source] 2022-07-30 00:14:14
>>fleven+(OP)
Even if we assume that the features will be basically unbreakable, your world will still end up looking like the following.

Entities (ab)using remote attestation in order of 'screws over those below them':

Government > Cyber criminal groups > Large organizations > Normal people.

Do you want to live in a world where a large corp can dictate which $VERSION of $APPROVED_SOFTWARE you should be running? I think fundamentally it's just not the direction we should be going. I don't doubt that proper remote attestation will eventually be possible, but until then it will be possible to bypass it in countless ways. And eventually you'd probably end up with only a single software stack, assumed to be flawlessly secure.

I think, luckily, this will severely limit the usability of technology that works this way. Developing for this stack will be a pain, and the machine will have all sorts of super annoying limitations: can't use that display, the driver is not vetted; can't use that USB webcam, it might have DMA; and so on. That will hopefully harm the uptake of such technologies.

As so often in tech, remote attestation in your case is a technical fix for a social problem. If the problem is sharing sensitive data with institutions you don't trust, then you need to build that trust, or transform the institutions so that they can be trusted. Transparency, laws, oversight, that type of stuff.

replies(1): >>Schroe+ba
5. fleven+q1[view] [source] [discussion] 2022-07-30 00:14:52
>>gjsman+z
Unfortunately, it does seem likely that many services will require that your machine run a kernel/web browser signed by an entity they trust before they give you access to what they consider sensitive data. That will suck for those of us who want to build our own kernels/web browsers and use that software to interact with sensitive data from large corporations, but that's their choice to make (IMHO). And it's my choice not to use their service.
replies(2): >>est31+53 >>smolde+yf
6. userbi+C1[view] [source] 2022-07-30 00:17:31
>>fleven+(OP)
> Wouldn't you prefer that this data only be able to be decrypted by a computer that can prove to the world it booted a clean OS image with all the latest security patches installed?

No.

Contrarily unpopular opinion: You cannot own data except what resides on your own property. Once you give someone a copy, it is theirs to do with as they wish. They may tell you what they will and will not do, but it is entirely on you to trust them.

...and that's the peril of things like remote attestation and other "zero trust" crap. They replace the nuanced meaning of trust that holds society together (and has literally done so since the beginning of life) with absolutes enforced by an unrelenting machine controlled by some faceless bureaucracy which is also partly under the command of the government. There should already be enough dystopian sci-fi to convince everyone why that is a really bad idea.

replies(5): >>fleven+72 >>game-o+B2 >>hedora+Q6 >>matheu+W7 >>dap+wc
7. fleven+72[view] [source] [discussion] 2022-07-30 00:23:25
>>userbi+C1
Right, but if they aren't going to follow best security practices and prove it (via a signed hardware attestation of the running software that includes the transport key they want me to use to send them the data), then I'm not going to send them the data. That's my choice.
replies(3): >>msla+D2 >>unionp+Y4 >>nradov+c8
8. game-o+B2[view] [source] [discussion] 2022-07-30 00:28:46
>>userbi+C1
Strongly agree.

We've already seen shades of this in banking. After chips were added to credit cards, people started having their chargebacks denied because "our records show the card was physically present" (even if the charge originated in another country).

How long until companies try to deny responsibility for data leaks because "our records show Windows was fully up-to-date and secure"?

replies(2): >>supert+u6 >>mike_h+yX
9. msla+D2[view] [source] [discussion] 2022-07-30 00:29:08
>>fleven+72
> if they aren't going to follow best security practices and prove it (via a signed a hardware attestation of the running software that includes the transport key they want me to use to send them the data)

You can mandate whatever remote attestation you want, and they'll follow whatever security practices they damn well feel like, and you can't do a damn thing about it. So, you've given up your ability to run software that doesn't spy on you, and they're operating business as usual, because they don't have a single goddamn reason to care what you think remote attestation means in the real world.

10. est31+53[view] [source] [discussion] 2022-07-30 00:34:42
>>fleven+q1
Often it's not your choice, when e.g. all banking apps have this requirement, and banks require an app to allow you access to your account at all. Or when it's a health service because the data is so "sensitive". Today, platforms like Discord and Twitter very often want your phone number despite not having any technological need for it. Will they in the future require this thing as well so that they are sure that you are not using ad blockers? Will you be unable to communicate with most of society through these "optional" services if you don't have one of these "trusted computing" devices?

This is way more than just about not watching movies in 4k that you could also pirate. This is about turning people who don't have "trusted computing" devices that track every behaviour of theirs into societal outcasts.

replies(2): >>fleven+n3 >>kmeist+ej
11. nouser+c3[view] [source] 2022-07-30 00:36:36
>>fleven+(OP)
> Let's say I'm a healthcare provider, and I'm about to send sensitive medical data to a third party vendor.

Let's say I'd like mandatory disclosure on shenanigans like that, so I can avoid this healthcare provider.

replies(3): >>revolv+P4 >>blkfnw+c5 >>Rebelg+p5
12. fleven+n3[view] [source] [discussion] 2022-07-30 00:39:57
>>est31+53
So how do you solve this? Get the government to ban CPU vendors from implementing hardware-rooted remote attestation? I can assure you that this technology is used inside corporations for their own internal security, and such a ban would weaken our ability to survive a cyberwar.
replies(2): >>est31+P8 >>matheu+x9
13. csomar+s3[view] [source] 2022-07-30 00:40:38
>>fleven+(OP)
> Wouldn't you prefer that this data only be able to be decrypted by a computer that can prove to the world it booted a clean OS image with all the latest security patches installed?

Yeah, because transferring that data into another machine is an impossible task.

That's the stupidest argument I've heard today...

replies(1): >>fleven+B6
14. nouser+Q3[view] [source] [discussion] 2022-07-30 00:45:02
>>gerdes+h1
"Attestation" of a VM is such a fraught concept... Isn't whole idea of virtualization, to outright lie to the "guest" operating system?

Yes, dear Windows, you're running on a dual-core Xeon Gold 6326 with i440BX chipset. Don't ask how this is possible, just trust me...

replies(2): >>Wowfun+N4 >>wmf+q5
15. fleven+q4[view] [source] [discussion] 2022-07-30 00:52:36
>>gerdes+h1
Implemented properly, the idea is that you have a chain of certificates (rooted by the CPU vendor's public key) that can identify all the different bits of software that have executed on the machine, along with an ephemeral public key. The hardware guarantees that the associated private key can only be wielded by the software versions that the chain attested to. So when you initiate your TLS connection with this machine, you can validate the cert chain and understand exactly what software the machine is running, assuming that you trust the CPU vendor and all the versions of the software that were attested to.
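
A rough sketch of the verifier's side of that, with hypothetical names and Ed25519 standing in for whatever signature scheme the vendor actually uses (real deployments use formats like TPM quotes or SGX reports):

    # Hypothetical sketch: walk an attestation chain rooted in the CPU
    # vendor's key, check every attested measurement, and recover the
    # ephemeral public key to pin in the TLS handshake.
    from dataclasses import dataclass
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    @dataclass
    class AttestationCert:      # simplified stand-in for a real cert format
        measurement: bytes      # hash of the software component that was loaded
        subject_key: bytes      # public key handed to the next link in the chain
        signature: bytes        # made by the previous link's key

    def verify_chain(vendor_root: bytes, chain: list[AttestationCert],
                     trusted: set[bytes]) -> bytes:
        """Return the attested ephemeral public key, or raise."""
        signer = Ed25519PublicKey.from_public_bytes(vendor_root)
        for cert in chain:
            # Each link is signed by the key attested in the previous link.
            signer.verify(cert.signature, cert.measurement + cert.subject_key)
            if cert.measurement not in trusted:
                raise ValueError("untrusted software version in boot chain")
            signer = Ed25519PublicKey.from_public_bytes(cert.subject_key)
        return chain[-1].subject_key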
replies(1): >>teaket+85
16. Wowfun+N4[view] [source] [discussion] 2022-07-30 00:57:42
>>nouser+Q3
And so isn't this basically the flaw in the whole idea? You can always emulate a TPM. You can always boot with a stock kernel and have the host patch the memory afterwards. Software can try to detect whether it's running in a VM, but the VM can lie. Last I heard, blocking VMs didn't go so well when nVidia tried it.

Am I wrong about the effectiveness of this? I'll readily admit I don't understand most of the underlying tech here.

replies(2): >>no_tim+w5 >>taco99+C5
17. revolv+P4[view] [source] [discussion] 2022-07-30 00:58:00
>>nouser+c3
Right?! It's telling that that's the use case: "what if we want to securely exchange The New Oil for some sweet cash, without the chance of some other asshole horning in on our racket or the little people hearing about it?"
18. unionp+Y4[view] [source] [discussion] 2022-07-30 01:00:15
>>fleven+72
> best security practices

As defined by whom? Some government organization (and which government)?

This will end up making everything more ossified and less secure.

But also, once that is in place, various organizations and governments will be able to force you to use whatever spyware they want in order for your attestation to go through.

replies(2): >>judge2+S5 >>mike_h+MX
19. teaket+85[view] [source] [discussion] 2022-07-30 01:02:24
>>fleven+q4
> … assuming that you trust the CPU vendor and all the versions of the software that were attested to.

That’s a huge caveat.

You also cannot verify your trust is deserved, and that it will continue to be deserved, because such a system by its very nature must be opaque to untrusted parties (which means you).

replies(1): >>wmf+G5
20. blkfnw+c5[view] [source] [discussion] 2022-07-30 01:02:41
>>nouser+c3
You already do have mandatory disclosure on shenanigans like that in the US. It's the boilerplate HIPAA agreement you sign when you first see a provider.

Good luck finding a provider that doesn't ship your sensitive medical data out to an EMR company though.

21. Rebelg+p5[view] [source] [discussion] 2022-07-30 01:04:06
>>nouser+c3
Can you even do paper prescriptions any more? I've only had digital ones my entire adult life.

Quick edit to answer my own question: in my home state paper prescriptions are only legal in a few situations (if it's for an animal, glasses, or a justifiable emergency). However, in some parts of the country they're still possible. Even if I had a choice, I prefer the convenience of sending the data digitally; once you actually fill the paper prescription, CVS or whoever is still gonna be able to glean sensitive medical info, so you're just delaying the inevitable.

replies(3): >>blkfnw+A5 >>rubatu+Df >>salawa+m61
22. wmf+q5[view] [source] [discussion] 2022-07-30 01:04:10
>>nouser+Q3
The hardware attests the hypervisor, the hypervisor attests the OS, the OS attests the app, etc. It all works as long as you chain down to the unique key in hardware.
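
A tiny stdlib sketch of that chaining, which is essentially the TPM PCR-extend scheme (placeholder byte strings standing in for the real images):

    # Measured boot in miniature: each layer folds a hash of the next
    # layer into a running digest before handing over control
    # (new_pcr = SHA-256(old_pcr || SHA-256(component))).
    import hashlib

    def extend(pcr: bytes, component: bytes) -> bytes:
        return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

    firmware, hypervisor, os_kernel, app = b"fw", b"hv", b"kernel", b"app"
    pcr = b"\x00" * 32                    # reset value at power-on
    for layer in (firmware, hypervisor, os_kernel, app):
        pcr = extend(pcr, layer)
    # A quote signed by the hardware-unique key over pcr then attests
    # the whole stack at once.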
23. no_tim+w5[view] [source] [discussion] 2022-07-30 01:05:03
>>Wowfun+N4
>Am I wrong about the effectiveness of this?

Partially. For online attestation you'd be missing the most important part: the vendor-signed keypair that is insanely hard to extract from the device.

replies(1): >>traver+O6
24. blkfnw+A5[view] [source] [discussion] 2022-07-30 01:05:46
>>Rebelg+p5
Yes, but they're about as inconvenient as can possibly be. Pharmacies here seem to have digital prescriptions available the same day, while I almost always have to return another day when dropping off a paper prescription.
replies(1): >>supert+s8
25. taco99+C5[view] [source] [discussion] 2022-07-30 01:05:51
>>Wowfun+N4
The emulated TPM will not contain the TPM manufacturer's private key that is used to sign responses.
replies(2): >>cesarb+Oe >>mindsl+i81
26. wmf+G5[view] [source] [discussion] 2022-07-30 01:06:49
>>teaket+85
> such a system by its very nature must be opaque to untrusted parties

No, you can attest to a completely open source system. Nobody's actually doing that, but it's possible. The private keys have to be secret and non-extractable, but that's it.

replies(3): >>teaket+26 >>blkfnw+z6 >>salawa+P51
27. judge2+S5[view] [source] [discussion] 2022-07-30 01:08:53
>>unionp+Y4
Best security practices as defined by Microsoft, but if I want, I can still create my own arbitrary security requirements and enforce them via software/audits.
replies(3): >>userbi+Kd >>patrak+ql >>dzikim+Wt
28. pdonis+16[view] [source] 2022-07-30 01:10:00
>>fleven+(OP)
> Let's say I'm a healthcare provider, and I'm about to send sensitive medical data to a third party vendor. Wouldn't you prefer that this data only be able to be decrypted by a computer that can prove to the world it booted a clean OS image with all the latest security patches installed?

No, because this still doesn't mean my data is secure. A human can still go into the third party vendor's system and see my data, and if that human wants to steal it, they can. No amount of remote attestation will prevent that.

> Remote attestation gives more control to the owners of data to dictate how that data is processed on third-party machines

Oh, really? So every time my health care provider wants to send my data to a third party, a remote attestation confirmation box will pop up on my phone so I can say yes or no, or ask more questions about the third party vendor's computers?

Ultimately the problem here is trust, and trust is a social problem, and as the saying goes, you can't use technology to solve a social problem. But you sure can pretend to be using technology to "solve" a problem in order to get people to give up more and more control over their devices.

29. teaket+26[view] [source] [discussion] 2022-07-30 01:10:03
>>wmf+G5
The fundamental security mechanism upon which the entire system hinges is opaque.
30. supert+u6[view] [source] [discussion] 2022-07-30 01:14:19
>>game-o+B2
The error in logic there is that chip usage is stronger proof, but not infallible. How was the account started? Cards can be stolen from mailboxes and purses. Some smartcard manufacturers have poor key handling security; Gemalto emailed keys before writing them to SIMs. Some EMV chips were vulnerable to replay attacks due to shoddy implementation.

This is why consumer protection laws are more important than any technical means of financial security. Having a super duper hardware wallet to store your cryptocurrency doesn't negate the irreversible nature of transactions.

Raw data is even harder to secure than money. Money in a banking system can be clawed back or frozen. Data can't be un-leaked.

31. blkfnw+z6[view] [source] [discussion] 2022-07-30 01:15:27
>>wmf+G5
Plenty of people do it. I use tpm2-totp for it. There is a key sealed in my TPM that will only unseal for known boot stacks (firmware/bootloader/kernel). I have the same key stored in my Yubikey's TOTP application. After boot I can verify my stack by comparing a TOTP code generated by my Yubikey with one generated by the TPM.

Caveat is that security only extends into the kernel image, so for my use case I embed the initrd in the kernel image and have all the filesystems and swap on a dm-crypt volume.

I also have to unseal and reseal when performing upgrades of the initramfs and above, but I'm fine with that.
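
For reference, the comparison step is plain RFC 6238 TOTP over a shared secret; a minimal stdlib sketch (assuming the common SHA-1/30-second/6-digit parameters):

    # Both devices hold the same secret, so for any given 30 s window
    # the TPM's code and the Yubikey's code must match.
    import hashlib, hmac, struct, time

    def totp(secret: bytes, now: float | None = None) -> str:
        counter = int(now if now is not None else time.time()) // 30
        mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return f"{code % 1_000_000:06d}"

    # The boot stack is trusted only if totp(tpm_unsealed_secret) equals
    # the code the Yubikey displays for the same time window.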

replies(1): >>jstanl+is
32. fleven+B6[view] [source] [discussion] 2022-07-30 01:15:40
>>csomar+s3
Admittedly it is a simplified scenario for the sake of argument. You'd need to have a full attestation of everything that controls access to the data (including ACLs) to get much of a guarantee.

But in a future world it's not hard to imagine the vendor software running in some sort of SGX-like environment that is very difficult to manually extract the data from.

33. grishk+C6[view] [source] 2022-07-30 01:15:50
>>fleven+(OP)
> Wouldn't you prefer that this data only be able to be decrypted by a computer that can prove to the world it booted a clean OS image with all the latest security patches installed?

I trust myself more than I trust anyone or anything else. It's as simple as that. I don't even slightly trust Microsoft, Google, or Apple.

Your logic is built on an invalid premise that these companies can, in fact, be trusted.

> Remote attestation gives more control to the owners of data to dictate how that data is processed on third-party machines (or even their own machines that may have been compromised).

This is exactly what I want to avoid. It's my device. It should only ever serve me, not anyone else, including its manufacturer and/or OS developer. It should not execute a single instruction that isn't in service of helping me achieve something.

Also, the concept of ownership can simply not be applied to something that does not obey the physical conservation law, i.e. can be copied perfectly and indefinitely.

replies(1): >>fleven+77
34. traver+O6[view] [source] [discussion] 2022-07-30 01:17:34
>>no_tim+w5
I'll extract them for 40k a pop all day long. I've got the hardware in storage from an old contract. Side channel power analysis is fun.
replies(1): >>no_tim+e8
35. hedora+Q6[view] [source] [discussion] 2022-07-30 01:18:01
>>userbi+C1
Strongly agree, but even if you are wrong on all those points, I still don't want to be forced to run the exact same monoculture software as everyone else.

Good luck getting your x86-64 windows kernel + chrome JavaScript exploit chain to run on my big endian arm 64 running Linux and Firefox.

(Also, the existence of competition like that regularly forces all the alternatives to improve.)

36. thrown+U6[view] [source] 2022-07-30 01:18:26
>>fleven+(OP)
>Hardware-based attestation of the running software is an important security feature, especially in a world where data leaks and identity theft are rampant. Let's say I'm a healthcare provider, and I'm about to send sensitive medical data to a third party vendor. Wouldn't you prefer that this data only be able to be decrypted by a computer that can prove to the world it booted a clean OS image with all the latest security patches installed?

I'd prefer it not be run on a computer which has already been compromised with a UEFI rootkit, which is what trusted computing has gotten us so far.

replies(1): >>dap+3d
37. matheu+W6[view] [source] 2022-07-30 01:18:57
>>fleven+(OP)
All cryptography is important. Hardware attestation is no exception. The problem is who's using it, who owns the keys, who's exploiting who.

It's totally fine if it's used to empower and protect us, normal people. If we can use this with our own keys to cryptographically prove that our own systems haven't been tampered with, it's not evil, it's amazing technology that empowers us.

What we really don't need is billion dollar corporations using cryptography to enforce their rule over their little extra-legal digital fiefdoms where they own users and sell access to them to other corporations or create artificial scarcity out of infinite bits. Such things should be straight up illegal. I don't care how much money it costs them, they shouldn't be able to do it.

The problem is we have chip makers like Intel and AMD catering to the billion dollar corporation's use case instead of ours. They come up with technology like IME and SGX. They sell us chips that are essentially factory-pwned by mega-corps. My own computer will not obey me if it's not in some copyright owner's interest to do so and I can't override their control due to their own cryptographic roadblocks. Putting up these roadblocks to user freedom should be illegal.

replies(2): >>userbi+Px >>mike_h+1Y
38. nitwit+57[view] [source] 2022-07-30 01:20:50
>>fleven+(OP)
You'd only be verifying the machine you sent the data to was up to date, but that is likely to be a file server or router of some sort. You'd need to validate the entire network that will touch the data.

While it might be theoretically possible to assert the whole network is up to date, the hospitals will definitely fail that check. There's all sorts of equipment the hospitals can't update for various reasons, such as aging radiology machines.

39. fleven+77[view] [source] [discussion] 2022-07-30 01:20:52
>>grishk+C6
If I want to buy a device that can generate a proof I can share with others to increase their trust in me, you shouldn't be able to stop me. Implemented properly, these machines can still boot whatever custom software you want; you don't have to share the proof of what booted with anyone.
replies(2): >>grishk+z7 >>nulbyt+bW
40. grishk+z7[view] [source] [discussion] 2022-07-30 01:26:54
>>fleven+77
I'm not saying that secure boot is inherently a bad idea. It's a good idea but only if all signing keys are treated equally. Right now, they aren't. AFAIK modern motherboards, those of them that use UEFI, come with Microsoft keys preloaded — and that preferential treatment is the part that's not okay. In an ideal world, all devices that support secure boot should come with a completely empty keystore so that you could either trust Microsoft keys or generate your own key pair and trust that. Possibly re-sign the Windows bootloader with it even.

It's much, much worse with mobile devices. You can re-lock the bootloader on a Pixel with your custom key, but you still can't touch TrustZone and you'll still get a warning on boot that it's not running an "official" OS build.

replies(1): >>mindsl+h71
41. matheu+W7[view] [source] [discussion] 2022-07-30 01:29:18
>>userbi+C1
> You cannot own data except what resides on your own property. Once you give someone a copy, it is theirs to do with as they wish.

Completely agree. These outdated notions of information ownership are destroying free computing as we know it. Everything "hacker" stands for is completely antithetical to such notions.

42. nradov+c8[view] [source] [discussion] 2022-07-30 01:32:24
>>fleven+72
In the US healthcare industry, providers are legally mandated to share patient data with certain other organizations. You don't have a choice.
replies(1): >>salawa+G41
43. no_tim+e8[view] [source] [discussion] 2022-07-30 01:32:34
>>traver+O6
lol, if I had USA money I'd go for it at 40k.

I've read once about the hardware tricks DRM dongles use in the silicon itself. Doesn't sound like a 40k job :^)

44. supert+s8[view] [source] [discussion] 2022-07-30 01:35:25
>>blkfnw+A5
Electronic prescriptions are only faster if they prioritize them over paper orders. The bottleneck is waiting for the pharmacist to verify the prescription. If they're not too busy, the process can run to completion while you travel from the doctor's office. If they're busy due to short staffing or heavy demand, your escript is part of a several-hundred-long queue.
replies(1): >>petre+gp
45. est31+P8[view] [source] [discussion] 2022-07-30 01:40:03
>>fleven+n3
Using this technology to secure non-private infrastructure, including corporate networks, makes total sense. And yes, it has some helpful properties to secure that infrastructure. But don't be mistaken, configuration mistakes still exist, as do zero days. Attestation helps against persistence, and this is valuable, but it's only one link in the chain.

That being said, extending it to everyone in a way that curtails individual control of computing devices creates an environment that is dangerous in many ways. I don't want to live in a world where only "approved" software is allowed on my computer. This can go wrong really quickly, and a lot of the application of attestation technology to consumers is really just about removing their freedoms.

The place where the government should step in, IMO, is not banning CPU vendors from implementing this, but passing anti-discrimination laws, i.e. banning companies from requiring remote attestation to unlock some specific feature. Companies should maybe be allowed to endorse it, or to warn you, but they should still allow full access regardless.

For the B2B setting there are obvious dangers of monopoly abuse; here the government just needs to enforce existing laws. Microsoft dropping the requirement that the signing key for third parties has to be trusted is IMO a major antitrust violation.

46. matheu+i9[view] [source] [discussion] 2022-07-30 01:48:40
>>gjsman+z
> Massively reduced cheating in online multiplayer games is something many gamers could appreciate (unless they cheat).

I have a rooted Android phone and I had to spend effort spoofing attestation in order to even launch some of my games which don't even have multiplayer. Allow me to be the one to tell you that I do not appreciate it.

I don't even care enough to cheat at these games but if I wanted to cheat it would be merely an exercise of my computer freedom which nobody has any business denying.

47. matheu+x9[view] [source] [discussion] 2022-07-30 01:52:02
>>fleven+n3
> Get the government to ban CPU vendors from implementing hardware-rooted remote attestation?

Get the government to regulate the corporations requiring it. Classify any attestation requirement as discrimination or something. They're excluding people without good reason.

48. Schroe+W9[view] [source] 2022-07-30 01:56:58
>>fleven+(OP)
> Let's say I'm a healthcare provider, and I'm about to send sensitive medical data to a third party vendor. Wouldn't you prefer that this data only be able to be decrypted by a computer that can prove to the world it booted a clean OS image with all the latest security patches installed?

I'd rather be able to access it without google or microsoft sticking their nose in.

I'd rather be able to combine it with my other data in whatever ways I see fit.

I'd rather be able to back it up in whatever way I see fit.

I'd rather be able to open it on a device that doesn't have a backdoor provided by the US government.

Because it's not microsoft's or qualcomm's data, it's mine.

49. Schroe+ba[view] [source] [discussion] 2022-07-30 02:02:58
>>SCHiM+j1
Unless you're including faang/ms in government, large organizations belong at the top of the list.

Who needs espionage or lobbying when you have an undetectable root shell on every computer in the country?

50. stefan+nb[view] [source] 2022-07-30 02:21:29
>>fleven+(OP)
This crap hasn't worked and it will never work. Console vendors have built much stronger things than remote attestation, but one thing stays true: if you sign crappy, vulnerable code, you are effectively signing any code.
51. XorNot+0c[view] [source] 2022-07-30 02:28:36
>>fleven+(OP)
The reality though is none of this is "secure" except by extensive, massive collusion and centralization in society - and such a thing is implicitly able to be used against the people as much as it might be used for them.

The only reason such hardware is secure is because the resources required to hack it are large.

Basically, a sane system would be: two parties exchange their own TPM keys which they generated on device themselves. They agree to a common set of measurements they will use with their TPMs to determine if they believe the systems are running normally. They then exchange data.
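
A toy sketch of that sane version, with Ed25519 keys standing in for TPM-resident ones and illustrative names throughout:

    # Each party generates its own key on-device, hands the public half
    # to the peer directly, and both agree up front on the measurements
    # that count as "running normally".
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    AGREED = {"pcr7": bytes.fromhex("ab" * 32)}   # negotiated, not vendor-dictated

    my_key = Ed25519PrivateKey.generate()         # never leaves my machine
    my_pub = my_key.public_key()                  # exchanged out of band

    def blob(measurements: dict) -> bytes:
        return b"".join(k.encode() + v for k, v in sorted(measurements.items()))

    def make_quote(key, measurements: dict) -> bytes:
        return key.sign(blob(measurements))       # "my system measures like this"

    def check_quote(peer_pub, quote: bytes) -> bool:
        try:
            peer_pub.verify(quote, blob(AGREED))  # raises on any mismatch
            return True
        except Exception:
            return False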

What's happening instead: a large company uses its market position to bake in its own security keys, which the user can't access or change. They then use their market position to demand your system be configured a specific way that they control. Everyone else defers to them because they're a big player and manufacturing TPMs is complicated. They have full control of the process.

The essential difference is that rather than two individuals establishing trust and agreeing to protocols for it, secured with the aid of technology, one larger party seizes control by coercion, pretends it'll never do wrong, and lets people "trust" each other only as mediated by its own definition. Trust between individuals ceases to exist, because it's trust only for as long as you're not betrayed by the middle-man.

Weirdly enough, this is actually a big god damn problem if you work for any organization that's doing government or security work, because the actual processes of those places tend to hinge on whether they believe large corporate providers doing things like this are doing them well enough, or can be trusted enough, to be part of the process. So even if you're notionally part of "the system", it doesn't actually make anything easier: in an ideal world, open-source security components would let COTS systems be used by defense and government departments with surety, because they'd be built from an end-user trust and empowerment perspective.

So even the notional beneficiaries tend to have problems, because a security assessment ends up at "well, we just have to trust Microsoft not to screw up", and while the head of the NSA might be able to call them up and get access, a random state-level government department trying to handle healthcare or traffic data or whatever cannot.

52. dap+wc[view] [source] [discussion] 2022-07-30 02:34:25
>>userbi+C1
It's not necessarily "you" vs. "someone else". You could be one person with two computers and want one computer to be able to attest to the other computer something about its software. (Imagine it's not two computers, but a thousand computers that are exposed to both physical and network attacks.)
53. novok+Oc[view] [source] 2022-07-30 02:38:50
>>fleven+(OP)
IMO the entire remote attestation thing is an obfuscated dance about who has root, control, and ultimately ownership over devices.

If vendors were plain about it, "attestation" wouldn't be a big deal: you do not own the devices, we do, and you lease them from us, maybe for a one-time fee.

But companies know it won't actually fly if you're plain about it, ESPECIALLY with large corporations and governments, who will outright refuse to buy your services or equipment for many key things if, for multiple reasons, they are not the ultimate controllers of the machines.

54. dap+3d[view] [source] [discussion] 2022-07-30 02:42:06
>>thrown+U6
These primitives can be used to tell you when you're talking to a machine that has not been compromised with a UEFI rootkit.
replies(1): >>thwart+eg
55. userbi+Kd[view] [source] [discussion] 2022-07-30 02:49:47
>>judge2+S5
Microsoft. The same company which strongly pushes a spyware-filled, user-hostile OS. "best"? Really?

> but if I want, I can still create my own arbitrary security requirements and enforce them via software/audits

Try doing that to your bank or whatever other large company you interact with...

replies(1): >>judge2+7e
56. judge2+7e[view] [source] [discussion] 2022-07-30 02:56:48
>>userbi+Kd
You can't have your cake and eat it too. Everyone has agency to decide who they interact with and who they give money to or, on the other side, who they sell products to/provide services to, and there are remarkably few exceptions to this rule (most based on things that the victim can't control). If a company wants to require you only use their products, or only use an allowlist of approved products, they can do that, just as you can decide not to use their services if they charge too much, perform unethical actions, or even if their company name contains the letter 'Y'.
replies(1): >>accoun+pB5
57. cesarb+Oe[view] [source] [discussion] 2022-07-30 03:08:58
>>taco99+C5
Which is why the comment which started this sub-thread mentioned buying extra physical TPM 2.0 chips. They contain the correct keys, and since they're external devices, it's trivial to lie to them, pretending to be the physical CPU doing a normal boot.

Of course, that only works until they start rejecting external TPM chips, and accepting only the built-in "firmware" TPMs found in more recent CPUs.

replies(1): >>wmf+Vm
58. rubatu+tf[view] [source] 2022-07-30 03:19:10
>>fleven+(OP)
No, that would be horrible for data portability, which is the reason many hospitals are locked into shitty EPR systems. If you are going to design a way to transfer data across healthcare providers, it better be as easy as sending a fax. Sheesh.
59. smolde+yf[view] [source] [discussion] 2022-07-30 03:19:42
>>fleven+q1
It's not your choice. The choice will be taken away, and you will not have the choice to control your computer any longer.
60. rubatu+Df[view] [source] [discussion] 2022-07-30 03:20:43
>>Rebelg+p5
Go to any emergency room here in Winnipeg and yeah they will send you home with a handwritten prescription.
61. thwart+Wf[view] [source] 2022-07-30 03:26:19
>>fleven+(OP)
> Wouldn't you prefer that this data only be able to be decrypted by a computer that can prove to the world it booted a clean OS image with all the latest security patches installed?

This is a pretty bad example. The attack vector is rarely, if ever, the technical way the encrypted file is received or where it is decrypted. The attack vector is what happens after it's decrypted. You've given an encrypted file to a computer you've confirmed knows how to decrypt it "securely" (whatever that means). And after that, that clean OS image with all the latest security patches still enables, by design, the decrypted data to be used (read, manipulated, do whatever it is you sent it to them in the first place) and sent someplace else or copied to removable media.

62. thwart+eg[view] [source] [discussion] 2022-07-30 03:29:57
>>dap+3d
Which is meaningless. Bad actors can use machines that have not been compromised with a rootkit.
63. chii+7i[view] [source] 2022-07-30 03:57:35
>>fleven+(OP)
I don't think the attestation mechanism itself is wrong, but the ability to perform remote attestation is going to be abused to lock in consumers (even if in some circumstances there's improved security in using it).
64. kmeist+9j[view] [source] [discussion] 2022-07-30 04:11:58
>>gjsman+z
IMHO I don't see a CAPTCHA onslaught happening - if only because at that point the CAPTCHAs will be practically useless at stopping bots. They will just ban untrusted devices.

The current landscape of CAPTCHA technology is pretty bleak. It's pretty easy to use ML to learn and solve the early first-gen CAPTCHAs that just used crossed-out words. Google reCAPTCHA relies primarily on user data, obfuscation, and browser fingerprinting to filter out bots, but that only works because of (possibly misplaced) trust in Google. It falls back to an image recognition challenge (which hCaptcha uses exclusively) if you don't have a good data profile - which can also be solved by automated means.

I don't see desktop Linux being fully untrusted off the Internet, if only because Google won't let it happen. They banned Windows workstations internally over a decade ago and they are institutionally reliant upon Linux and macOS. What will almost certainly happen is that Linux will be relegated to forwarding attestation responses between Pluton, some annoying blob in Google Chrome, and any web service that does not want to be flooded with bots in our new hellscape of post-scarcity automation.

65. kmeist+ej[view] [source] [discussion] 2022-07-30 04:14:02
>>est31+53
Discord and Twitter want your phone number to limit how many accounts you are allowed to sign up for.
replies(1): >>est31+Zn1
66. richar+ak[view] [source] [discussion] 2022-07-30 04:27:26
>>gjsman+z
Another "doomsday scenario" is that a distinct market for FOSS hardware will arise, albeit much shittier hardware than what everyone else uses. Those users will become fringe and progressively isolated.
replies(1): >>accoun+oE5
67. patrak+ql[view] [source] [discussion] 2022-07-30 04:49:17
>>judge2+S5
Best security practices as defined by Microsoft = "You can't have a computer if your country is under US sanctions". Important word: US, a single country. I don't want to punch such a huge hole in any of my systems.
68. wmf+Vm[view] [source] [discussion] 2022-07-30 05:18:30
>>cesarb+Oe
Yeah, Pluton "fixes" this because it's inside the CPU.
69. petre+gp[view] [source] [discussion] 2022-07-30 05:53:24
>>supert+s8
What's stopping them from making a cert and encoding it in a QR code? It was perfectly possible with covid vaccination certificates.
replies(1): >>supert+pK
70. jstanl+is[view] [source] [discussion] 2022-07-30 06:31:08
>>blkfnw+z6
> After boot I can verify my stack by comparing a TOTP code generated by my Yubikey with one generated by the TPM.

But if you're not sure whether the system booted cleanly, then it might be compromised. If it's compromised couldn't your tools simply lie about the codes generated by both the TPM and the Yubikey so that they always match?

71. dzikim+Wt[view] [source] [discussion] 2022-07-30 06:54:46
>>judge2+S5
An S&P 500 corp I often work with has a security department filled with mindless drones who say things like "regular enforced password changes are a well-regarded best practice".

You almost certainly use software that calls their servers at some point. Hope you will enjoy their vision of security. I'm moving into the woods if they get to define how my _personal_ computer behaves.

72. userbi+Px[view] [source] [discussion] 2022-07-30 07:59:16
>>matheu+W6
> The problem is who's using it, who owns the keys, who's exploiting who.

The governments know this all too well; that's why they've been trying to ban cryptography, and it was (and I believe still is in many cases) classified as a munition.

73. supert+pK[view] [source] [discussion] 2022-07-30 11:15:13
>>petre+gp
Verify as in check for drug interactions and catch any medication errors by the physician.
74. nulbyt+bW[view] [source] [discussion] 2022-07-30 13:28:41
>>fleven+77
You are correct, but I think that misses the point: Neither you nor I should be forced to buy only devices that run specified software as determined by a third-party. You are making this out to be a choice, that if it's available and you want it, you should be able to buy it. However, the worry is not over a choice to buy such devices, but over a mandate that only such devices be available and no others.
75. mike_h+yX[view] [source] [discussion] 2022-07-30 13:38:09
>>game-o+B2
That seems to be an argument for damned if you do, damned if you don't. Yes, people need some incentive for deploying security upgrades and being able to say "we are sure it wasn't us" in disputes is part of that incentive. Otherwise why bother? If people get treated the same whether they made a genuine good faith effort to be secure, or do nothing, then you're just rewarding the companies that ignored security to focus on other things.
76. mike_h+MX[view] [source] [discussion] 2022-07-30 13:40:26
>>unionp+Y4
"as defined by whom? Some government (which one) organization ?"

As defined by the user.

RA doesn't care what software you run. In fact RA is better supported by Linux than any other OS! And, although the discussion in this thread is about RA of entire machines, that's actually pretty old school. Modern RA is all about attesting the tiniest slice of code possible, hence the "enclave" terminology. The surrounding OS and infrastructure doesn't get attested because it can be blinded with encryption. This is beneficial for both sides. I don't actually necessarily care how you configure your OS or even if it's up to date with security patches, if the security model treats the entire OS as an adversary, which is how Intel SGX works. You just attest the code inside the enclave and I send/receive encrypted messages with it.

77. mike_h+1Y[view] [source] [discussion] 2022-07-30 13:42:09
>>matheu+W6
But that's not a good argument because SGX isn't something that empowers the Big Guys over the Little Guys. In fact it's the other way around - they took it out of their consumer chips and now it's only found in their server class chips. So companies can create RA proofs and send them to users, but not the other way around.
78. salawa+G41[view] [source] [discussion] 2022-07-30 14:43:24
>>nradov+c8
They actually aren't. The only reason it's necessitated is that A) medicare/medicaid integration is strongly predicated on EMR, and B) our damn insurance model is cripplingly dependent on it.

There's nothing that keeps a medical provider from going old school.

Unless I'm completely overlooking something... It may have snuck in with ACA.

replies(1): >>nradov+lz1
79. salawa+P51[view] [source] [discussion] 2022-07-30 14:52:42
>>wmf+G5
Yes. Intel is willing to lend me all the equipment and logic analyzers I need to analyze their products, access to their internal design docs, access to their engineering team to answer my questions, etc, etc...

Do you realize how daft and unrealistic your assertion is?

Tell ya what. You get Broadcom, Intel, AMD, Nvidia, etc... to go full transparent, and we'll talk.

80. salawa+m61[view] [source] [discussion] 2022-07-30 14:57:49
>>Rebelg+p5
Yes! You can!

And in fact, if your provider is doing ePrescribing, odds are they are contributing to supporting a monopoly by SureScripts, who has cornered the market with anti-competitive business practices!

DEA still issues serialized paper prescription pads.

https://www.ftc.gov/news-events/news/press-releases/2019/04/...

Every time an ePrescription goes over the wire, this one weird company based out of Virginia is likely shotgunning your personal info, as collected by PBMs/health insurers, between all parties involved (with the obligatory copy for themselves, probably "anonymized" and repackaged for monetizable exposure to research groups), and its contractual terms require that people in the network not make arrangements with anyone else for the service.

As a common victim of the perniciousness of this arrangement, I'm more than familiar with how this nonsense goes.

81. mindsl+h71[view] [source] [discussion] 2022-07-30 15:05:09
>>grishk+z7
This logic works for software signing, but not remote attestation. For remote attestation, the "tamper-proof-ness" is the root of the trust chain, and the signing keys are individually baked into the specific piece of hardware and not controlled by a third party. You seem to be hoping that we can disrupt that chain of trust by having manufacturers not record the public keys associated with each piece of hardware (such that individuals could create their own signing keys on open hardware), but that's just not going to happen.
82. mindsl+i81[view] [source] [discussion] 2022-07-30 15:11:26
>>taco99+C5
nit: the TPM contains its own internally-generated private key. That private key never leaves the TPM, and has nothing intrinsic to the manufacturer.

The manufacturer then signs the public portion of that TPM key, creating the ability for everyone to assert that said key was generated internal to their hardware (and thus couldn't be used by an emulator).

You yourself could also sign the public portion of the TPM key, or even generate a new one and sign it, but that wouldn't affect the perverse incentive generated by the manufacturer's assertion. It would just enable you to assert that you trust the TPM key is internal to the TPM without trusting the manufacturer's records.

We're dealing with something like the dual of software signing here.
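
A sketch of the verifier's view of that endorsement, with illustrative names (real TPMs ship the manufacturer's assertion as an X.509 EK certificate, not a bare signature):

    # The manufacturer signs the TPM's public key once, at the factory.
    # Anyone holding the manufacturer's public key can then check that a
    # given TPM key really lives inside genuine silicon.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    def tpm_key_is_genuine(tpm_pub: bytes, endorsement_sig: bytes,
                           manufacturer_pub: bytes) -> bool:
        try:
            Ed25519PublicKey.from_public_bytes(manufacturer_pub).verify(
                endorsement_sig, tpm_pub)
            return True
        except Exception:
            return False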

83. est31+Zn1[view] [source] [discussion] 2022-07-30 17:06:54
>>kmeist+ej
That's only part of it; Twitter is also in the ad business, and in the ad industry phone numbers are used as identifiers to correlate users between datasets.

If it's just about limiting access: Cloudflare imposes a similar limitation on the number of accesses you can make to a website via remote attestation. I think once remote attestation becomes more prevalent, it might become useful in the ad business too, e.g. to prevent you from using ad blockers, or similar things.

84. nradov+lz1[view] [source] [discussion] 2022-07-30 18:23:11
>>salawa+G41
Yes you are overlooking a variety of more recent federal laws and associated interoperability regulations, some of which apply even to providers that only accept direct payments from patients and don't bill third-party payers (insurers).

The basic guiding principle in force since HIPAA in 1996 is that patients, not providers, control access to their medical records regardless of whether those are stored on paper or in an EHR. If the patient authorizes sharing those records with another healthcare organization then the provider can charge a small fee for that service but they can't introduce additional spurious technical requirements on the receiving system.

85. accoun+pB5[view] [source] [discussion] 2022-08-01 11:36:01
>>judge2+7e
Corporations are an artificial construct that we as a society let exist. We can decide to add additional restrictions to that existence like requiring them to not discriminate based on what software you run on your own devices.
86. accoun+oE5[view] [source] [discussion] 2022-08-01 11:59:11
>>richar+ak
This is already the case for FOSS communications software and social networks.

In the past you could use your FOSS client to communicate with your ICQ-, AIM-, and MSN Messenger-using friends. Today, not using the official client will likely get you banned from the network.
