zlacker

1. Someon+(OP)[view] [source] 2021-04-07 15:10:49
How would that work? You'd be layering trust on trust: if they're willing to lie about one thing, they're willing to lie about confirming that same thing.

Unless you're going to hire some independent auditor (whom you still have to trust), it seems logically problematic.

replies(1): >>madars+v
2. madars+v[view] [source] 2021-04-07 15:13:49
>>Someon+(OP)
SGX enclaves can attest to the code they are running, so you don't have to take Signal's word on faith (a rough sketch of the verification idea follows below).
replies(2): >>eptcyk+C1 >>Someon+oo
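
[For readers unfamiliar with remote attestation, a minimal sketch of the client-side check might look like the following. This is a conceptual illustration, not Signal's actual code or API: the helper names (fetch_attestation_quote, extract_mrenclave), the byte offsets, and the expected-measurement value are hypothetical placeholders, and a real client would also verify the quote's signature chain against Intel's attestation service before trusting any field.]

    import hmac

    # Hypothetical: the MRENCLAVE measurement published for a reproducible
    # build of the server enclave (placeholder value, not a real digest).
    EXPECTED_MRENCLAVE = bytes.fromhex("aa" * 32)

    def fetch_attestation_quote() -> bytes:
        """Hypothetical stand-in for requesting an SGX quote from the server."""
        return b"\x00" * 432  # placeholder quote bytes

    def extract_mrenclave(quote: bytes) -> bytes:
        """Hypothetical parser: pull the enclave measurement out of the quote.

        A real implementation would first verify the quote's signature chain
        before trusting any field it contains.
        """
        return quote[112:144]  # offset is illustrative only

    def enclave_is_trusted() -> bool:
        quote = fetch_attestation_quote()
        mrenclave = extract_mrenclave(quote)
        # Constant-time comparison of the reported measurement against the
        # measurement expected from the published enclave code.
        return hmac.compare_digest(mrenclave, EXPECTED_MRENCLAVE)

    if __name__ == "__main__":
        print("enclave trusted:", enclave_is_trusted())

[Note that this only shifts the trust question: the guarantee is only as good as the expected measurement, which has to come from audited, reproducibly built enclave source, and from trusting the CPU vendor's attestation keys.]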
3. eptcyk+C1[view] [source] [discussion] 2021-04-07 15:18:31
>>madars+v
Except SGX enclaves are horribly broken.
replies(1): >>monoca+03
4. monoca+03[view] [source] [discussion] 2021-04-07 15:23:41
>>eptcyk+C1
Like, does an SGX enclave attest that Meltdown is patched in microcode? That's one way to pull the keys out.

The recent-ish work that gained read/write access to some Intel CPUs' microcode can probably break SGX too. I wouldn't be surprised if the ME code-execution flaws could be used that way as well.

5. Someon+oo[view] [source] [discussion] 2021-04-07 16:55:55
>>madars+v
That isn't a solution to the problem being discussed (making a provider's server code verifiable by end users). I'm quite confused by the suggestion that it could be, or is.