zlacker

[parent] [thread] 6 comments
1. nybble+(OP)[view] [source] 2022-01-27 18:54:33
Reproducible builds allow the user of the software to verify the version that they are using or installing. They do not, by themselves, allow the sort of remote attestation which would permit a service to verify the context for authentication—the user, or a malicious actor, could simply modify the device to lie about the software being run.
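The distinction can be seen in a minimal sketch (Python; the artifact bytes and names are illustrative, not from any real build system): with a reproducible build, anyone can recompute the digest and compare it against the published one, but the comparison runs on the user's own machine, so it proves nothing to a remote party.

```python
import hashlib

def artifact_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a build artifact."""
    return hashlib.sha256(data).hexdigest()

# With a reproducible build, independent builders produce byte-identical
# artifacts, so a user can rebuild locally and compare digests:
local_build = b"deterministic artifact bytes"   # hypothetical artifact
published_digest = artifact_digest(b"deterministic artifact bytes")

assert artifact_digest(local_build) == published_digest

# But this check is entirely local. A tampered device can skip it or
# report success regardless -- which is why reproducible builds do not,
# by themselves, give a service remote attestation of device state.
```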

Secure attestation about device state requires something akin to Secure Boot (with a TPM), and in the context of a BYOD environment precludes the device owner having full control of their own hardware. Obviously this is not an issue if the organization only permits access to its services from devices it owns, but no organization should have that level of control over devices owned by employees, vendors, customers, or anyone else who requires access to the organization's services.

replies(1): >>Initia+p8
2. Initia+p8[view] [source] 2022-01-27 19:29:20
>>nybble+(OP)
> no organization should have that level of control over devices owned by employees, vendors, customers, or anyone else who requires access to the organization's services.

It seems like the sensible rule of thumb is: If your organization needs that level of control, it's on your organization to provide the device.

replies(1): >>jacobr+Mv
3. jacobr+Mv[view] [source] [discussion] 2022-01-27 20:55:15
>>Initia+p8
Or we could better adopt secure/confidential computing enclaves. These would let the organization control its siloed apps and validate some security properties (no code tampering, memory encryption, etc.) without needing to trust that other apps on the device, or even the OS, weren't compromised.
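A toy model of the trust relationship being described (Python; an HMAC over a shared secret stands in for the hardware-rooted signing key a real enclave would use, and all names here are hypothetical): the enclave measures the loaded code and signs the measurement, and the relying party accepts only a correctly signed, expected measurement.

```python
import hashlib
import hmac

# Stand-in for a manufacturer-provisioned key the verifier trusts.
# Real enclaves use asymmetric, hardware-rooted keys, not a shared secret.
HARDWARE_KEY = b"hypothetical provisioned key"

def quote(code: bytes) -> tuple[str, bytes]:
    """Enclave side: measure the loaded code and sign the measurement."""
    measurement = hashlib.sha256(code).hexdigest()
    sig = hmac.new(HARDWARE_KEY, measurement.encode(), hashlib.sha256).digest()
    return measurement, sig

def verify(measurement: str, sig: bytes, expected: str) -> bool:
    """Relying party: check the signature, then the expected measurement."""
    expected_sig = hmac.new(HARDWARE_KEY, measurement.encode(),
                            hashlib.sha256).digest()
    return hmac.compare_digest(expected_sig, sig) and measurement == expected

code = b"siloed app binary"
m, s = quote(code)
assert verify(m, s, hashlib.sha256(code).hexdigest())          # accepted
assert not verify(m, s, hashlib.sha256(b"tampered").hexdigest())  # rejected
```

The point of the sketch is only the trust structure: the verifier's confidence rests on the signing key, which is exactly the dependency on the manufacturer that the sibling comment raises.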
replies(2): >>wizzwi+vP >>nybble+PW
4. wizzwi+vP[view] [source] [discussion] 2022-01-27 22:21:58
>>jacobr+Mv
I'm uncomfortable letting organisations have control over the software that runs on my hardware. (Or, really, any hardware I'm compelled to use.)

Suppose the course I've been studying for the past three years now uses $VideoService, but $VideoService uses remote attestation and gates the videos behind a retinal scan, ten distinct fingerprints, the last year's GPS history and the entire contents of my hard drive?¹ If I could spoof the traffic to $VideoService, I could get the video anyway, but every request is signed by the secure enclave. (I can't get the video off somebody else, because it uses the webcam to identify when a camera-like object is pointed at the screen. They can't bypass that, because of the remote attestation.)

If I don't have ten fingers, and I'm required to scan ten fingerprints to continue, and I can't send fake data because my computer has betrayed me, what recourse is there?

¹: exaggeration; no real-world company has quite these requirements, to my knowledge

replies(1): >>jacobr+6x3
5. nybble+PW[view] [source] [discussion] 2022-01-27 22:54:40
>>jacobr+Mv
Secure enclaves are still dependent on someone other than the owner (usually the manufacturer) having ultimate control over the device. Otherwise the relying party has no reason to believe that the enclave is secure.
6. jacobr+6x3[view] [source] [discussion] 2022-01-28 18:00:04
>>wizzwi+vP
So there are two levels of tradeoffs:

1) The requirements themselves. These differ between consumer and employee scenarios. In general, I'd prefer we err on the side of DRM-free for things like media, but there are legitimate concerns around things like data privacy when you are an employee of an organization handling sensitive data.

2) Presuming there are legitimate reasons to require strong validation of the user and untampered software, we have a choice between A) using only organization-supplied hardware in those cases, or B) using your own hardware with some kind of restriction. I'd much prefer to use my own as much as possible ... if I can be assured that it won't spy on me or limit what I can do outside the organization-specific purposes I've explicitly opted in to enable.

> I'm uncomfortable letting organisations have control over the software that runs on my hardware.

I'm not, if we can sandbox. I'm fine with organizations running JavaScript in my browser, for instance, or running mobile apps that can access certain data with explicit permissions (like granting access to my photos so that I can share them in-app). I think we can do better with more granular permissions, better UX, and cryptographic guarantees to both the user and the organization that the computation and the data are operating at the agreed level.
