zlacker

[return to "I read the federal government’s Zero-Trust Memo so you don’t have to"]
1. static+ua[view] [source] 2022-01-27 15:56:56
>>EthanH+(OP)
This is pretty incredible. These aren't just good practices; they're fairly bleeding-edge best practices.

1. No more SMS and TOTP. FIDO2 tokens only.

2. No more unencrypted network traffic - including DNS. Encrypted DNS is such a recent development, and they're already mandating it. Incredible.

3. Context-aware authorization. So not just "can this user access this?" but attestation about device state! That's extremely cutting edge - almost no one does that today.
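
Concretely, the access decision in 3. takes device state as an input alongside the user's identity. A hypothetical sketch - none of these names come from the memo:

  from dataclasses import dataclass

  @dataclass
  class DevicePosture:
      disk_encrypted: bool
      os_patch_age_days: int
      attestation_ok: bool      # device proved its state, e.g. via a TPM/MDM-backed claim

  def authorize(user_has_fido2_session: bool, posture: DevicePosture,
                sensitivity: str) -> bool:
      # Device state is part of the decision, not just who the user is.
      if not (user_has_fido2_session and posture.attestation_ok):
          return False
      if sensitivity == "high":
          return posture.disk_encrypted and posture.os_patch_age_days <= 30
      return True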

My hope is that this mandate makes these practices more widely accessible. We do all of this today at my company, except where we can't - for example, a lot of our vendors don't offer FIDO2 2FA or WebAuthn, so we're stuck with TOTP.
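
(For reference, "FIDO2 only" on the relying-party side mostly comes down to the WebAuthn registration options you hand the browser. A plain-dict sketch of PublicKeyCredentialCreationOptions; the rp/user values are placeholders, and the byte fields would be base64url-encoded before being sent to the browser:)

  import os

  registration_options = {
      "rp": {"id": "example.com", "name": "Example Corp"},      # placeholder relying party
      "user": {"id": os.urandom(16), "name": "jdoe", "displayName": "J. Doe"},
      "challenge": os.urandom(32),
      "pubKeyCredParams": [{"type": "public-key", "alg": -7}],  # ES256
      "authenticatorSelection": {
          "userVerification": "required",             # PIN or biometric on the token
          "authenticatorAttachment": "cross-platform" # i.e. a hardware security key
      },
      "attestation": "direct"                         # ask the token to prove its make/model
  }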

◧◩
2. c0l0+IH[view] [source] 2022-01-27 18:23:10
>>static+ua
I think 3. is very harmful for actual, real-world use of Free Software. If only specific builds of software that are on a vendor-sanctioned allowlist - governed by the signature of a "trusted" party that grants them entry to said list - can meaningfully access networked services, then everyone who compiles their own artifacts (even from completely identical source code) will be excluded from accessing that remote site/service.

Banks and media corporations are doing it today by requiring a vendor-sanctioned Android build/firmware image, attested and allowlisted by Google's SafetyNet (https://developers.google.com/android/reference/com/google/a...), and it will only get worse from here.
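
(For anyone who hasn't seen the server side of that: the app returns a signed JWS and the service checks a couple of boolean verdicts inside it. A sketch from memory of the SafetyNet Attestation API - a real implementation also has to verify the JWS signature chain against Google's roots before trusting the payload:)

  import base64, json

  def safetynet_payload(jws: str) -> dict:
      payload = jws.split(".")[1]
      payload += "=" * (-len(payload) % 4)        # restore stripped base64 padding
      return json.loads(base64.urlsafe_b64decode(payload))

  def device_allowed(jws: str, expected_nonce: str) -> bool:
      v = safetynet_payload(jws)
      return (v.get("nonce") == expected_nonce
              and v.get("basicIntegrity") is True
              and v.get("ctsProfileMatch") is True)   # this is what fails on custom ROMs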

Remote attestation really is killing practical software freedom.

◧◩◪
3. seibel+9I[view] [source] 2022-01-27 18:25:33
>>c0l0+IH
Reproducible builds are a thing, though I don't know how widespread they are. I know the Monero project has them built in, so everyone compiles the exact same executable regardless of environment and can verify the hash against the official release: https://github.com/monero-project/monero
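
(The verification step itself is just a hash comparison - a sketch with placeholder values:)

  import hashlib

  def sha256_of(path: str) -> str:
      h = hashlib.sha256()
      with open(path, "rb") as f:
          for chunk in iter(lambda: f.read(1 << 20), b""):
              h.update(chunk)
      return h.hexdigest()

  official = "0" * 64            # placeholder: the hash from the project's signed hashes file
  if sha256_of("monerod") == official:
      print("reproduced the official binary bit-for-bit")
  else:
      print("mismatch: different source, different toolchain, or a tampered build")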
◧◩◪◨
4. nybble+GO[view] [source] 2022-01-27 18:54:33
>>seibel+9I
Reproducible builds allow the user of the software to verify the version that they are using or installing. They do not, by themselves, allow the sort of remote attestation which would permit a service to verify the context for authentication—the user, or a malicious actor, could simply modify the device to lie about the software being run.

Secure attestation about device state requires something akin to Secure Boot (with a TPM), and in a BYOD environment it precludes the device owner from having full control of their own hardware. Obviously this is not an issue if the organization only permits access to its services from devices it owns, but no organization should have that level of control over devices owned by employees, vendors, customers, or anyone else who requires access to the organization's services.
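
(Very roughly: the device signs a statement about its measured boot state with a key that never leaves the TPM, and the service checks both the signature and the reported measurements against values it considers healthy. A hand-wavy server-side sketch - not a real TPM quote format, all the names are made up, and it assumes an RSA attestation key:)

  from dataclasses import dataclass
  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.asymmetric import padding
  from cryptography.hazmat.primitives.serialization import load_pem_public_key

  @dataclass
  class Quote:                      # stand-in for a real TPM quote structure
      nonce: bytes                  # challenge issued by the server, prevents replay
      pcr_digest: bytes             # digest over the measured boot/software state
      signature: bytes              # produced by the attestation key inside the TPM

  KNOWN_GOOD_DIGESTS = {bytes.fromhex("00" * 32)}   # placeholder allowlist of healthy states

  def verify_quote(quote: Quote, expected_nonce: bytes, ak_pub_pem: bytes) -> bool:
      ak = load_pem_public_key(ak_pub_pem)          # attestation public key, enrolled earlier
      try:
          ak.verify(quote.signature, quote.nonce + quote.pcr_digest,
                    padding.PKCS1v15(), hashes.SHA256())
      except Exception:
          return False
      return quote.nonce == expected_nonce and quote.pcr_digest in KNOWN_GOOD_DIGESTS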

◧◩◪◨⬒
5. Initia+5X[view] [source] 2022-01-27 19:29:20
>>nybble+GO
> no organization should have that level of control over devices owned by employees, vendors, customers, or anyone else who requires access to the organization's services.

It seems like the sensible rule of thumb is: If your organization needs that level of control, it's on your organization to provide the device.

◧◩◪◨⬒⬓
6. jacobr+sk1[view] [source] 2022-01-27 20:55:15
>>Initia+5X
Or we could better adopt secure/confidential-computing enclaves. This would let the organization control the siloed apps and validate some security properties (no code tampering, memory encryption, etc.) without needing to trust that other apps on the device, or even the OS, weren't compromised.
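
(The trust boundary then shrinks to a measurement of the code inside the enclave rather than the whole device. Conceptually something like this - the field names are illustrative; real SGX/SEV reports are verified through the vendor's attestation service and certificate chain:)

  EXPECTED_MEASUREMENT = bytes.fromhex("00" * 32)   # placeholder: hash of the enclave build the org shipped

  def accept_enclave(report: dict) -> bool:
      return (report.get("measurement") == EXPECTED_MEASUREMENT   # the org's code, untampered
              and report.get("debug_mode") is False               # refuse debuggable enclaves
              and report.get("tcb_status") == "up_to_date")       # platform firmware patched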
◧◩◪◨⬒⬓⬔
7. wizzwi+bE1[view] [source] 2022-01-27 22:21:58
>>jacobr+sk1
I'm uncomfortable letting organisations have control over the software that runs on my hardware. (Or, really, any hardware I'm compelled to use.)

Suppose the course I've been studying for the past three years now uses $VideoService, but $VideoService uses remote attestation and gates the videos behind a retinal scan, ten distinct fingerprints, the last year's GPS history and the entire contents of my hard drive?¹ If I could spoof the traffic to $VideoService, I could get the video anyway, but every request is signed by the secure enclave. (I can't get the video off somebody else, because it uses the webcam to identify when a camera-like object is pointed at the screen. They can't bypass that, because of the remote attestation.)

If I don't have ten fingers, and I'm required to scan ten fingerprints to continue, and I can't send fake data because my computer has betrayed me, what recourse is there?

¹: exaggeration; no real-world company has quite these requirements, to my knowledge

[go to top]