zlacker

[parent] [thread] 60 comments
1. woah+(OP)[view] [source] 2021-04-07 17:31:45
A lot of these comments are just manifestations of the kneejerk HN "crypto bad" reflex. Here's the deal:

- Whether or not Signal's server is open source has nothing to do with security. Signal's security rests on the user's knowledge that the open source client is encrypting messages end to end. With that knowledge, the server code could be anything, and Signal inc. would still not be able to read your messages. In fact, having the server code open source adds absolutely nothing to this security model, because no matter how open source and secure the server code might be, Signal inc. could still be logging messages upstream of it. The security rests only upon the open source client code. The server is completely orthogonal to security.

- Signal's decision to keep early development of the MobileCoin feature set private was valid. Signal is not your weekend node.js module with two stars on Github. When changes get made to the repo, they will be noticed. This might mess up their marketing plan, especially if they weren't even sure whether they were going to end up going live with the feature. Signal is playing in the big leagues, competing with messengers which have billions of dollars in marketing budget, will never ever be even the smallest amount open source, and are selling all your messages to the highest bidder. They can't afford to handicap themselves just to keep some guys on Hacker News happy.

- Signal's decision to keep development to the (private) master branch, instead of splitting the MobileCoin integration into a long-running feature branch is a valid choice. It's a lot of work to keep a feature branch up to date over years, and to split every feature up into the public and non-public components which then get committed to separate branches. This would greatly affect their architecture and slow down shipping for no benefit, given that the open sourceness of the server is orthogonal to security.

replies(10): >>lucide+5a >>gojomo+xr >>codeth+4s >>pmlnr+pu >>unhamm+zu >>iudqno+xw >>emptys+gE >>Cynicu+XW >>dang+le1 >>hda2+mR1
2. lucide+5a[view] [source] 2021-04-07 18:11:54
>>woah+(OP)
> - Whether or not Signal's server is open source has nothing to do with security. [...] having the server code open source adds absolutely nothing to this security model, [...] The security rests only upon the open source client code. The server is completely orthogonal to security.

The issue a lot of people have with Signal is that your definition here of where security comes from is an extremely narrow & technical one, and many would rather look at security in a more holistic manner.

The problem with messaging security is that there are two ends, and individually we only control one of them. Ok, screenshotting & leaking your messages will always be a concern no matter what technology we develop, but the other challenge is just getting the other end to use Signal in the first place, and that's governed by the network effect of competitors.

Open Source is essential for security because one of the most fundamental security features we can possibly hope to gain is platform mobility. Signal doesn't offer any. If Signal gains mass adoption and the server changes, we're right back to our current security challenge: getting your contacts onto the new secure thing.

replies(2): >>kreetx+Ve >>woah+xU3
◧◩
3. kreetx+Ve[view] [source] [discussion] 2021-04-07 18:32:14
>>lucide+5a
But now the server code is there, so we have this mobility, no?
replies(2): >>acrisp+qg >>lucide+8j
◧◩◪
4. acrisp+qg[view] [source] [discussion] 2021-04-07 18:38:47
>>kreetx+Ve
Until they decide to go silent for another 11 months
replies(1): >>kreetx+hO
◧◩◪
5. lucide+8j[view] [source] [discussion] 2021-04-07 18:49:25
>>kreetx+Ve
Yes and no.

Signal is not actually designed with mobility in mind (in fact I would argue, based on Moxie's 36C3 talks, it was designed to be and continues to be persistently kept anti-mobility). That fact is independent of it being open- or closed-source.

However, if the server is open-source, it opens the door for future mobility in the event of org change. If it's closed-source, you get what's currently happening with WhatsApp.

In actuality, if we had something federated, with mobility pre-baked in, having a closed-source server would be less of a security-risk (the gp's comments on only needing to trust the client would apply more strongly since mobility removes the power to change from server maintainers)

Basically:

- with multi-server clients (e.g. Matrix/OMEMO), you have no dependency on any orgs' server, so their being open-source is less relevant (provided the protocol remains open—this can still go wrong, e.g. with GChat/FBMessenger's use of XMPP).

- with single-server clients (Telegram/WhatsApp/Signal), you are dependent on a single server, so that server being open-source is important to ensure the community can make changes in the event of org change.

replies(1): >>kreetx+ON
6. gojomo+xr[view] [source] 2021-04-07 19:22:42
>>woah+(OP)
Signal Foundation has legitimate self-serving strategic reasons to prefer such secrecy, sure.

But users also have legitimate reasons to want more transparency into both source-code & strategy.

Whether such secrecy best serves the users & the cause of private messaging is an open question.

replies(1): >>frombo+KZ4
7. codeth+4s[view] [source] 2021-04-07 19:25:24
>>woah+(OP)
> Whether or not Signal's server is open source has nothing to do with security

This is true only when you are exclusively concerned about your messages' content but not about the metadata. As we all know, though, the metadata is the valuable stuff.

There is a second reason it is wrong, though: These days, lots of actual user data (i.e. != metadata) gets uploaded to the Signal servers[0] and encrypted with the user's Signal PIN (modulo some key derivation function). Unfortunately, many users choose an insecure PIN, not a passphrase with lots of entropy, so the derived encryption key isn't particularly strong. (IMO it doesn't help that it's called a PIN. They should rather call it "ultra-secure master passphrase".) This is where a technology called Intel SGX comes into play: It provides remote attestation that the code running on the servers is the real deal, i.e. the trusted and verified code, and not the code with the added backdoor. So yes, the server code does need to be published and verified.

Finally, let's not forget the fact that SGX doesn't seem particularly secure, either[1], so it's even more important that the Signal developers be open about the server code.

[0]: https://signal.org/blog/secure-value-recovery/

[1]: https://blog.cryptographyengineering.com/2020/07/10/a-few-th...

replies(3): >>codeth+vv >>wolver+HE >>im3w1l+N31
8. pmlnr+pu[view] [source] 2021-04-07 19:33:39
>>woah+(OP)
> A lot of these comments are just manifestations of the kneejerk HN "crypto bad" reflex.

Nope. It's a reaction to "who the f* asked for this in a messaging app?!".

replies(2): >>neolog+8X >>cptski+N91
9. unhamm+zu[view] [source] 2021-04-07 19:33:58
>>woah+(OP)
> The security rests only upon the open source client code. The server is completely orthogonal to security.

For Android at least, builds are reproducible https://signal.org/blog/reproducible-android/ (would be neat if there was one or more third party CI's that also checked that the CI-built app reproduces the one on Google Play Store – or maybe there already are?)

replies(2): >>cptski+la1 >>hnjst+Fp4
◧◩
10. codeth+vv[view] [source] [discussion] 2021-04-07 19:38:05
>>codeth+4s
Addendum: Out of pure interest I just did a deep dive into the Signal-Android repository and tried to figure out where exactly the SGX remote attestation happens. I figured that somewhere in the app there should be a hash or something of the code running on the servers.

Unfortunately, `rg -i SGX` only yielded the following two pieces of code:

https://github.com/signalapp/Signal-Android/blob/master/libs...

https://github.com/signalapp/Signal-Android/blob/master/libs...

No immediate sign of a fixed hash. Instead, it looks like the code only verifies the certificate chain of some signature? How does this help if we want to verify the server is running a specific version of the code and we cannot trust the certificate issuer (whether it's Intel or Signal)?

I'm probably (hopefully) wrong here, so maybe someone else who's more familiar with the code could chime in here and explain this to me? :)

replies(2): >>corbiq+mS3 >>codeth+YY3
11. iudqno+xw[view] [source] 2021-04-07 19:43:11
>>woah+(OP)
Focussing on whether the changes directly make things insecure is missing the point. Fundamentally this sort of security is about trust.

While it's nice to try to have Signal be resilient to attacks by the core team, there just aren't enough community-minded independent volunteer code reviewers to reliably catch them. I doubt the Signal Foundation gets any significant volunteer effort, even from programmers who aren't security experts.

That means I need to decide if I trust the Signal Foundation. Shilling sketchy cryptocurrencies is indicative of loose morals, which makes me think I was wrong to trust them in the past.

replies(1): >>cptski+R81
12. emptys+gE[view] [source] 2021-04-07 20:14:23
>>woah+(OP)
You're apologizing for a project that has repeatedly damaged user trust with excuses.

These are "valid" reasons for keeping the source code private for a year? By whose book? Yours? Certainly not by mine. I wouldn't let any other business abscond from its promise to keep open source open source in spirit and practice, why would I let Signal?

This is some underhanded, sneaky maneuvering I'm more used to seeing from the Amazons and the Facebooks of the world. These are not the actions of an ethically Good organization. And as Moxie has already demonstrated in his lust for power, he's more than capable of deviousness. On Wire vs Signal: "He claimed that we had copied his work and demanded that we either recreate it without looking at his code, or take a license from him and add his copyright header to our code. We explained that we have not copied his work. His behavior was concerning and went beyond a reasonable business exchange — he claimed to have recorded a phone call with me without my knowledge or consent, and he threatened to go public with information about alleged vulnerabilities in Wire’s implementation that he refused to identify." [1]

These are not the machinations of the crypto-idealist, scrappy underdog for justice that publications such as the New Yorker paint. This is some straight-up, moustache-twirling cartoon-villain plotting.

So now I'm being sold on a business vision that was just so hot the public's eyes couldn't bear it? We're talking about a pre-mined cryptocurrency whose inventors are laughing all the way to the bank.

At least Pavel Durov of Telegram is honest with his users. At least we have Element doing their work in the open for all to see with the Matrix protocol. There are better, more ethical, less shady organizations out there whom we can and ought to be putting our trust in, not this freakshow of a morally compromised shambles.

[1] https://medium.com/@wireapp/axolotl-and-proteus-788519b186a7

replies(2): >>selykg+7H >>mr_woo+ZI8
◧◩
13. wolver+HE[view] [source] [discussion] 2021-04-07 20:16:20
>>codeth+4s
> These days, lots of actual user data (i.e. != metadata) gets uploaded to the Signal servers[0] and encrypted with the user's Signal PIN (modulo some key derivation function). Unfortunately, many users choose an insecure PIN, not a passphrase with lots of entropy, so the derived encryption key isn't particularly strong.

If I understand what you are saying and what Signal says, Signal anticipates this problem and provides a solution that is arguably optimal:

https://signal.org/blog/secure-value-recovery/

My (limited) understanding is that the master key consists of the user PIN plus c2, a 256-bit code generated by a secure RNG, and that the Signal client uses a key derivation function to maximize the master key's entropy. c2 is stored in SGX on Signal's servers. If the user PIN is sufficiently secure, c2's security won't matter - an attacker with c2 still can't bypass the PIN. If the PIN is not sufficiently secure, as often happens, c2 stored in SGX might be the most secure way to augment it while still making the data recoverable.
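To make that concrete, here is roughly how I picture the combination (a sketch based only on my reading of the blog post, not Signal's actual code; the real client uses Argon2 rather than PBKDF2 and a different construction, and all of these names are mine):

    import java.security.SecureRandom;
    import javax.crypto.Mac;
    import javax.crypto.SecretKeyFactory;
    import javax.crypto.spec.PBEKeySpec;
    import javax.crypto.spec.SecretKeySpec;

    class SvrSketch {
        // Illustrative only: shows the shape "PIN-derived key mixed with server-held c2".
        static byte[] deriveMasterKey(char[] pin, byte[] salt, byte[] c2) throws Exception {
            // Stretch the low-entropy PIN. Stretching slows brute force but adds no entropy.
            SecretKeyFactory kdf = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
            byte[] stretched = kdf.generateSecret(new PBEKeySpec(pin, salt, 100_000, 256)).getEncoded();

            // Mix in c2, the 256-bit secret that only ever lives inside the SGX enclave.
            // Without c2, an attacker has to go through the enclave and its guess limit.
            Mac hmac = Mac.getInstance("HmacSHA256");
            hmac.init(new SecretKeySpec(c2, "HmacSHA256"));
            return hmac.doFinal(stretched);
        }

        static byte[] newC2() {
            byte[] c2 = new byte[32];            // 256 bits from a secure RNG
            new SecureRandom().nextBytes(c2);
            return c2;
        }
    }

If the PIN is weak, everything above hinges on the enclave refusing to hand out c2 after too many wrong guesses.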

I'd love to hear from a security specialist regarding this scheme. I'm not one and I had only limited time to study the link above.

replies(1): >>codeth+CT
◧◩
14. selykg+7H[view] [source] [discussion] 2021-04-07 20:25:47
>>emptys+gE
Repeatedly? This is the first I'm aware of, what are the others?
◧◩◪◨
15. kreetx+ON[view] [source] [discussion] 2021-04-07 20:56:31
>>lucide+8j
So in principle we do have this mobility, because you can run your own servers. Perhaps it's not all that unlikely that they will build a bridge to Matrix.
replies(1): >>lucide+On2
◧◩◪◨
16. kreetx+hO[view] [source] [discussion] 2021-04-07 20:58:11
>>acrisp+qg
Most of the popular chat-app space is not open source. What is it with Signal that people feel entitled to condemn it for not having the latest commits on github?
replies(2): >>neolog+uX >>acrisp+3T1
◧◩◪
17. codeth+CT[view] [source] [discussion] 2021-04-07 21:20:58
>>wolver+HE
> If I understand what you are saying and what Signal says, Signal anticipates this problem and provides a solution that is arguably optimal

Yep, this is what I meant when I said "This is where a technology called Intel SGX comes into play". :)

And you're right, SGX is better than nothing if you accept that people use insecure PINs. My argument mainly was that

- the UI is designed in the worst possible way and actually encourages people to choose a short insecure PIN instead of recommending a longer one. This means that security guarantees suddenly rest entirely on SGX.

- SGX requires the server code to be verified and published (which it wasn't until yesterday). Without verification, it's all pointless.

> uses a key derivation function to maximize the master key's entropy

Nitpick: Technically, the KDF is deterministic, so it cannot change the entropy and – as the article says – you could still brute-force short PINs (if it weren't for SGX).

> I'd love to hear from a security specialist regarding this scheme. I'm not one and I had only limited time to study the link above.

Have a look at link [1] in my previous comment. :)

18. Cynicu+XW[view] [source] 2021-04-07 21:38:55
>>woah+(OP)
>kneejerk HN "crypto bad" reflex

You make it sound as if that's a bad thing. Reflexes are beneficial when the threats are real. And the crypto“currency” multi-level marketing pyramid schemes are very real. They do nothing but induce greed, gambling, spam, and all-around toxic behavior. It's a digital cancer that needs to end.

◧◩
19. neolog+8X[view] [source] [discussion] 2021-04-07 21:39:31
>>pmlnr+pu
Text isn't the only thing I want to be able to send to people. I wish there were a universal "send thing" api that could be implemented for text, images, money, whatever.
replies(1): >>dunefo+4H8
◧◩◪◨⬒
20. neolog+uX[view] [source] [discussion] 2021-04-07 21:41:22
>>kreetx+hO
What is it with chat apps that people don't condemn them for being closed source? Imagine if GCC hid their changes for a year.
replies(1): >>kreetx+4l1
◧◩
21. im3w1l+N31[view] [source] [discussion] 2021-04-07 22:13:42
>>codeth+4s
SGX is just the processor pinky swearing (signed with Intel keys) that everything is totally legit. Nation State Adversaries can and will take Intel's keys and lie.
replies(1): >>codeth+c51
◧◩◪
22. codeth+c51[view] [source] [discussion] 2021-04-07 22:20:00
>>im3w1l+N31
SGX is also supposed to protect against Signal as a potential adversary, though, as well as against hackers. Or at least that's how I understood the blog article.
◧◩
23. cptski+R81[view] [source] [discussion] 2021-04-07 22:40:54
>>iudqno+xw
> Shilling sketchy cryptocurrencies is indicative of loose morals, which makes me think I was wrong to trust them in the past.

Who decided it was sketchy?

The "I don't like change so I'm going to piss all over you" attitude is what sinks a lot good things.

How does Signal benefit from being a shill for this coin? Are they being paid by MOB or do they get a % of the cut?

So far all I've read are people screaming their heads off that MOB eats babies and how dare Signal stoop so low as to even fart in their general direction, but I have yet to see anyone explain why MOB is bad or how Signal is bad for giving MOB a platform.

replies(3): >>boston+4b1 >>iudqno+9b1 >>parano+Dw1
◧◩
24. cptski+N91[view] [source] [discussion] 2021-04-07 22:44:54
>>pmlnr+pu
Unless you have your head thoroughly buried in the sand, you'd understand that all the major players allow people to send money AND people are using those platforms to send money.

When people evaluate a new messaging client, the minimum feature set required to be considered viable now includes sending money for a lot of the population.

* removed insult

replies(1): >>pmlnr+q92
◧◩
25. cptski+la1[view] [source] [discussion] 2021-04-07 22:47:57
>>unhamm+zu
That's pretty neat, I wasn't aware that was possible.
◧◩◪
26. boston+4b1[view] [source] [discussion] 2021-04-07 22:51:42
>>cptski+R81
Yea, I'm bearish on cryptocurrencies, but I think moxie and his team have built up an incredible amount of goodwill in my book. Enough for me to hear out their solution before making a decision. I'm assuming they didn't write dogecoin2 or even a bitcoin clone. It will be interesting to learn about it.
◧◩◪
27. iudqno+9b1[view] [source] [discussion] 2021-04-07 22:52:11
>>cptski+R81
> How does Signal benefit from being a shill for this coin? Are they being paid by MOB or do they get a % of the cut?

The CEO of signal messenger LLC was/is the CTO of MOB.

See https://www.reddit.com/r/signal/comments/mm6nad/bought_mobil... and https://www.wired.com/story/signal-mobilecoin-payments-messa...

replies(1): >>cptski+zo3
28. dang+le1[view] [source] 2021-04-07 23:10:22
>>woah+(OP)
> A lot of these comments are just manifestations of the kneejerk HN "crypto bad" reflex

The contrarian dynamic strikes again: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor...

◧◩◪◨⬒⬓
29. kreetx+4l1[view] [source] [discussion] 2021-04-07 23:56:48
>>neolog+uX
Sure, it would be nice if any software were open source, but that you are entitled to it? Funny attitude.
replies(1): >>neolog+0o1
◧◩◪◨⬒⬓⬔
30. neolog+0o1[view] [source] [discussion] 2021-04-08 00:16:24
>>kreetx+4l1
There's plenty of writing on that issue [1]. It makes a lot of sense to think of people being actually entitled to certain rights, especially in domains with network effects.

Btw, the Signal Foundation is a non-profit organization that benefits from community goodwill based on an open-source ethos. So people are critical when its software is closed source.

[1] https://www.gnu.org/philosophy/free-sw.en.html

replies(2): >>rOOb85+jH1 >>kreetx+wo2
◧◩◪
31. parano+Dw1[view] [source] [discussion] 2021-04-08 01:17:19
>>cptski+R81
Part of the problem is that at the moment any government trying to force Signal to break the e2e security model is clearly interfering with speech.

By incorporating cryptocurrency/payments, governments are being handed a massive lever to force Signal to comply with the financial monitoring requirements that governments have in place.

This has a negative impact on those of us who just wanted a secure communications platform.

replies(1): >>cptski+Kp3
◧◩◪◨⬒⬓⬔⧯
32. rOOb85+jH1[view] [source] [discussion] 2021-04-08 02:54:19
>>neolog+0o1
...its software is open source.
replies(1): >>neolog+SI1
◧◩◪◨⬒⬓⬔⧯▣
33. neolog+SI1[view] [source] [discussion] 2021-04-08 03:08:58
>>rOOb85+jH1
The reason this story is on HN is that the source was previously missing.
34. hda2+mR1[view] [source] 2021-04-08 04:49:04
>>woah+(OP)
> Whether or not Signal's server is open source has nothing to do with security.

Wrong. Availability is a core component of security. Keeping your server implementation closed and available only from one single entity gives Signal a failing grade in my book.

Signal is great for one-off messaging but is a poor long-term solution for secure instant messaging.

◧◩◪◨⬒
35. acrisp+3T1[view] [source] [discussion] 2021-04-08 05:14:27
>>kreetx+hO
By silent, I don't just mean they held back commits. They were evasive about it the entire time. They could have explained and chose not to.

They don't owe me anything but I think it's a shame that the leading open source messenger app does such a poor job of communicating with its users and the larger open source community.

◧◩◪
36. pmlnr+q92[view] [source] [discussion] 2021-04-08 08:17:24
>>cptski+N91
> all the major players allow people to send money AND people are using those platforms to send money.

So if major players jump off a cliff, everyone should always follow?

> the minimum feature set required to be considered viable now includes sending money for a lot of the population.

? No, thank you. Not where the actual banking system is working.

replies(1): >>cptski+1l3
◧◩◪◨⬒
37. lucide+On2[view] [source] [discussion] 2021-04-08 10:45:25
>>kreetx+ON
You cannot currently run your own Signal server, no. That's what prevents mobility.

You are free to examine the source of theirs (if they choose to continue releasing it), but you cannot self-host.

replies(1): >>kreetx+kJ2
◧◩◪◨⬒⬓⬔⧯
38. kreetx+wo2[view] [source] [discussion] 2021-04-08 10:53:35
>>neolog+0o1
I don't think a piece on gnu.org qualifies as "plenty of writing" and for sure doesn't count as a basis for what you are entitled to :).
replies(1): >>neolog+kM3
◧◩◪◨⬒⬓
39. kreetx+kJ2[view] [source] [discussion] 2021-04-08 13:40:33
>>lucide+On2
If both the code and the server are open source then how come you can't run it?
replies(1): >>lucide+Bd3
◧◩◪◨⬒⬓⬔
40. lucide+Bd3[view] [source] [discussion] 2021-04-08 16:02:40
>>kreetx+kJ2
If you check out the client source, compile it, and install it on your own mobile device, you can then connect it to your own self-hosted server instance. However, Signal's own server instance will then block your client (and there's no way to connect the client binaries they distribute to anything but their own server).

So you would have to then follow the above steps for any contacts you want to communicate with, distributing your own client to them. Signal devs have generally been extremely hostile toward anyone wishing to do this however.

The only way out of this situation would be if the Signal project itself was forked and people moved to that forked open-source multi-server client.

replies(1): >>kreetx+zV3
◧◩◪◨
41. cptski+1l3[view] [source] [discussion] 2021-04-08 16:39:19
>>pmlnr+q92
Just because you don't like it or don't find it relevant in your local circle doesn't mean it is without value. Last time I traveled to Italy, hailing and paying for taxis was handled exclusively over OTT messaging.
◧◩◪◨
42. cptski+zo3[view] [source] [discussion] 2021-04-08 16:56:14
>>iudqno+9b1
> The CEO of signal messenger LLC was/is the CTO of MOB.

But in Edit 4 of the Reddit post it says

> Also, the extended whitepaper wrongly cites Moxie as chief technology officer, while he is the technical advisor.

On top of that, this whitepaper that's being passed around is a forgery with 1.5 pages of factually incorrect information.

The Wired article you link isn't even really critical; it just matter-of-factly explains the feature and its backstory.

> Signal's choice of MobileCoin is no surprise for anyone watching the cryptocurrency's development since it launched in late 2017. Marlinspike has served as a paid technical adviser for the project since its inception, and he's worked with Goldbard to design MobileCoin's mechanics with a possible future integration into apps like Signal in mind. (Marlinspike notes, however, that neither he nor Signal own any MobileCoins.)

It seems like you're drawing conclusions based on lies you read second- or third-hand and didn't bother to verify. The Reddit post you linked to would be received very differently depending on when you read it, since it's been edited numerous times with addenda refuting earlier claims. Only by reading it from beginning to end, with all edits, can you start to see a clear picture. Even then, the picture I see is someone backpedaling on a lot of false claims they made.

◧◩◪◨
43. cptski+Kp3[view] [source] [discussion] 2021-04-08 17:01:24
>>parano+Dw1
Thank you, this is honestly the only clear and valid argument I've seen from anyone around the feature.

Everyone else is trying to pass off false information, doctored white papers, and all sorts of conspiracy theories to support their dislike for the feature.

◧◩◪◨⬒⬓⬔⧯▣
44. neolog+kM3[view] [source] [discussion] 2021-04-08 19:15:39
>>kreetx+wo2
> I don't think a piece on gnu.org qualifies as "plenty of writing"

There are some links there to other pieces if you want to read more about it.

> for sure doesn't count as basis for what you are entitled for

I'm not claiming that moral authority flows from the Gnu brand; rather, they provide some information and reasoning which people can use to come to their own conclusions.

replies(1): >>kreetx+kX3
◧◩◪
45. corbiq+mS3[view] [source] [discussion] 2021-04-08 19:49:33
>>codeth+vv
The hash of the code that is running in the enclave is called "MRENCLAVE" in SGX.

During remote attestation, the prover (here, Signal's server) creates a "quote" that proves it is running a genuine enclave. The quote also includes the MRENCLAVE value.

It sends the quote to the verifier (here, Signal-Android), which in turn sends it to the Intel Attestation Service (IAS). IAS verifies the quote, then signs the content of the quote, thus signing the MRENCLAVE value. The digital signature is sent back to the verifier.

Assuming that the verifier trusts IAS's public key (e.g., through a certificate), it can verify the digital signature and thus trust that the MRENCLAVE value is valid.

The code where the verifier is verifying the IAS signature is here: https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

The code where the MRENCLAVE value is checked is here: https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...
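
In pseudocode, the verifier's job boils down to something like this (purely illustrative Java, not Signal's actual classes; the quote parsing is hand-waved and the names are invented):

    import java.security.PublicKey;
    import java.security.Signature;
    import java.util.Arrays;

    class AttestationSketch {
        static void verify(byte[] quoteBody, byte[] iasSignature, PublicKey iasKey,
                           byte[] expectedMrenclave) throws Exception {
            // 1. The quote is only trustworthy if IAS signed it.
            Signature sig = Signature.getInstance("SHA256withRSA");
            sig.initVerify(iasKey);
            sig.update(quoteBody);
            if (!sig.verify(iasSignature)) throw new SecurityException("bad IAS signature");

            // 2. The MRENCLAVE in the quote must match the hash of the audited enclave code.
            if (!Arrays.equals(extractMrenclave(quoteBody), expectedMrenclave))
                throw new SecurityException("unexpected enclave code");
        }

        static byte[] extractMrenclave(byte[] quoteBody) {
            // A real implementation parses the SGX quote structure here.
            throw new UnsupportedOperationException("illustrative only");
        }
    }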

Hope this helps!

replies(1): >>codeth+lc4
◧◩
46. woah+xU3[view] [source] [discussion] 2021-04-08 20:01:39
>>lucide+5a
You're redefining the word "security" here to an incredibly expansive definition which includes all kinds of details about the ability for someone else to set up an interoperable service.
replies(1): >>lucide+uf7
◧◩◪◨⬒⬓⬔⧯
47. kreetx+zV3[view] [source] [discussion] 2021-04-08 20:08:18
>>lucide+Bd3
Ok, but should they then be forced to do things they don't want to do?

What I mean is: if Signal is not Element.io/Matrix, and the latter is better for freedom and openness, then one can agree with that. But the demand from people that Signal somehow owes them the ability to be like Matrix, be federated, etc., and the judgemental tone about it, is what rubs me the wrong way.

replies(1): >>lucide+vg7
◧◩◪◨⬒⬓⬔⧯▣▦
48. kreetx+kX3[view] [source] [discussion] 2021-04-08 20:20:04
>>neolog+kM3
Most if not all of the links point to themselves..

It's ok to think that in an ideal world it would be like that, but arguing as if you were entitled to the source because of it doesn't seem likely to persuade others. After all, if you aren't empathetic to the reality, how would you expect others to be empathetic to you?

replies(1): >>neolog+jH4
◧◩◪
49. codeth+YY3[view] [source] [discussion] 2021-04-08 20:28:45
>>codeth+vv
Addendum to the addendum: Whether there's a fixed hash inside the Signal app or not, here's one thing that crossed my mind last night that I have yet to understand:

Let's say we have a Signal-Android client C, and the Signal developers are running two Signal servers A and a B.

Suppose server A is running a publicly verified version of Signal-Server inside an SGX enclave, i.e. the source code is available on GitHub and has been audited, and server B is a rogue server, running a version of Signal-Server that comes with a backdoor. Server B is not running inside an SGX enclave but since it was set up by the Signal developers (or they were forced to do so) it does have the Signal TLS certificates needed to impersonate a legitimate Signal server (leaving aside SGX for a second). To simplify things, let's assume both servers' IPs are hard-coded in the Signal app and the client simply picks one at random.

Now suppose C connects to B to store its c2 value[0] and expects the server to return a remote attestation signature along with the response. What is stopping server B then from forwarding the client's request to A (in its original, encrypted and signed form), taking A's response (including the remote attestation signature) and sending it back to C? That way, server B could get its hands on the crucial secret value c2 and, as a consequence, later on brute-force the client's Signal PIN, without C ever noticing that B is not running the verified version of Signal-Server.

What am I missing here?

Obviously, Signal's cloud infrastructure is much more complicated than that, see [0], so the above example has to be adapted accordingly. In particular, according to the blog post, clients do remote attestation with certain "frontend servers" and behind the frontend servers there are a number of Raft nodes and they all do remote attestation with one another. So the real-life scenario would be a bit more complicated but I wanted to keep it simple. The point, in any case, is this: Since the Signal developers are in possession of all relevant TLS certificates and are also in control of the infrastructure, they can always MITM any of their legitimate endpoints (where the incoming TLS requests from clients get decrypted) and put a rogue server in between.

One possible way out might be to generate the TLS keys inside the SGX enclave, extract the public key through some public interface while keeping the private key in the encrypted RAM. This way, the public key can still be baked into the client apps but the private key cannot be used for attacks like the one above. However, for this the clients would once again need to know the code running on the servers and do remote attestation, which brings us back to my previous question – where in Signal-Android is that hash of the server code[1]?

[0]: https://signal.org/blog/secure-value-recovery/

[1]: More precisely, the code of the frontend enclave, since the blog post[0] states that it's the frontend servers that clients do the TLS handshake with:

> We also wanted to offload the client handshake and request validation process to stateless frontend enclaves that are designed to be disposable.

replies(1): >>kijiki+Q64
◧◩◪◨
50. kijiki+Q64[view] [source] [discussion] 2021-04-08 21:15:34
>>codeth+YY3
TLS is used, but there is another layer of encryption e2e from the client to inside the enclave. Your MITM server B can decrypt the TLS layer, but still can't see the actual traffic.
replies(1): >>codeth+L74
◧◩◪◨⬒
51. codeth+L74[view] [source] [discussion] 2021-04-08 21:22:02
>>kijiki+Q64
Just came back to post this but you beat me to it haha. Thank you! :) I just looked at the SGX 101 book and found the relevant piece: Client and enclave are basically doing a DH key exchange. https://sgx101.gitbook.io/sgx101/sgx-bootstrap/attestation#s...
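For anyone else following along, the idea (as I understand it, heavily simplified) is an ECDH exchange whose enclave-side key is bound into the attested quote, which is what defeats the MITM scenario above. A toy sketch using standard JCA classes, not the real protocol, with names of my own invention:

    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import java.security.PublicKey;
    import javax.crypto.KeyAgreement;

    class EnclaveChannelSketch {
        static byte[] clientSharedSecret(PublicKey enclaveEphemeralKey) throws Exception {
            // Client's ephemeral key pair for this session.
            KeyPairGenerator kpg = KeyPairGenerator.getInstance("EC");
            kpg.initialize(256);
            KeyPair client = kpg.generateKeyPair();

            // Shared secret that only the client and the code inside the enclave can compute;
            // a box that merely terminates TLS (server B) never sees it.
            KeyAgreement ka = KeyAgreement.getInstance("ECDH");
            ka.init(client.getPrivate());
            ka.doPhase(enclaveEphemeralKey, true);
            return ka.generateSecret();          // then fed into a KDF to derive session keys
        }
    }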
◧◩◪◨
52. codeth+lc4[view] [source] [discussion] 2021-04-08 21:45:39
>>corbiq+mS3
Thank you, this indeed helps a lot and I've now found the hashes![1] Moreover, thank you for providing me with some additional keywords that I could google for – this made it much easier to follow the SGX 101 "book"[0] and I think I've got a much better grasp now on how SGX works!

[0]: https://sgx101.gitbook.io/sgx101/sgx-bootstrap/attestation#r...

[1]: For other people interested in this matter: I've followed the path of the MRENCLAVE variable to the very end,

https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

where it gets injected by the build config. The build config, in turn, is available here:

https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

(The MRENCLAVE values can be found around line 120.)

replies(2): >>codeth+Xq4 >>codeth+XC4
◧◩
53. hnjst+Fp4[view] [source] [discussion] 2021-04-08 23:25:27
>>unhamm+zu
While I find this effort laudable and consider it a valid security improvement, the fact that Signal is opposed to alternate clients (and even to builds from other sources like F-Droid) opens up another, orthogonal risk.

I doubt that many people rebuild the app at each update to check that the new binaries match the ones provided by their store. If, for example, the Play Store were to broadly distribute a binary that doesn't match the published sources, some dedicated user would probably spot the issue.

However, with the Play Store (and Signal, though that isn't even necessary for the following) being under US jurisdiction, any user who doesn't check each update they receive is vulnerable to the famous NSL + gag order combo in the case of a targeted attack. I recognize that this is probably something most people do not include in their threat model, but I'm still a bit dubious that the convenience of simpler release management and of not having to worry about interoperability is worth accepting the risks of a single delivery channel, especially for what could (and is widely thought to) be a completely secure IM solution. "Almost secure" is, frighteningly, the worst obstacle to "secure"...

I'm admittedly biased since I'm convinced that federation, multiple client/server implementations and multiple distribution channels are a requirement for a secure IM infrastructure (which is why my heart goes to Matrix nowadays).

◧◩◪◨⬒
54. codeth+Xq4[view] [source] [discussion] 2021-04-08 23:36:27
>>codeth+lc4
Matthew Green's tweet[0] has now sent me down the rabbit hole again as I was trying to figure out where the IAS's certificate gets verified. (Without verification the IAS's attestation that the enclave's quote carries the correct signature would obviously be worthless.) I wanted to find out whether it gets pinned or whether Signal trusts a CA. It seems to be the latter:

https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

For completeness, let's also have a look at where the CA certificate(s) come(s) from. The PKIX parameters[1] are retrieved from the trustStore aka iasKeyStore which, as we follow the rabbit hole back up, gets instantiated here:

https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

As we can see, the input data comes from

https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

`R.raw.ias` in line 20 refers to the file `app/src/main/res/raw/ias.store` in the repository and, as we can see from line 25, it's encrypted with the password "whisper" (which seems weird, but it looks like this is a requirement[2] of the API). I don't have time to look at the file right now but it will probably (hopefully) contain only[3] Intel's CA certificate and not an even broader one. At least this is somewhat suggested by the link I posted earlier:

https://github.com/signalapp/Signal-Android/blob/master/libs...

In any case, it seems clear that the IAS's certificate itself doesn't get pinned. Not that it really matters at this point. Whether the certificate gets pinned or not, an attacker would only need access to the IAS server anyway to steal its private key. Then again, trusting a CA (and thus any certificate derived from it) obviously broadens the attack surface. OTOH it might be that Intel is running a large array of IAS servers that come and go and there is no guarantee on Intel's part that a pinned certificate will still be valid tomorrow. In this case, the Signal developers obviously can't do anything about that.
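
For reference, loading that trust store amounts to something like the following (my own sketch, not the exact Signal code; the "BKS" keystore type is an assumption on my part, since that's what Android ships via BouncyCastle):

    import java.io.InputStream;
    import java.security.KeyStore;
    import java.security.cert.PKIXParameters;

    class IasTrustStoreSketch {
        static PKIXParameters trustParams(InputStream iasStore) throws Exception {
            KeyStore ks = KeyStore.getInstance("BKS");       // keystore type is my assumption
            ks.load(iasStore, "whisper".toCharArray());      // password from the linked source
            return new PKIXParameters(ks);                   // every CA cert in here becomes a trust anchor
        }
    }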

[0]: https://twitter.com/matthew_d_green/status/13802817973139742...

[1]: https://docs.oracle.com/javase/8/docs/api/index.html?java/se...

[2]: https://stackoverflow.com/questions/4065379/how-to-create-a-...

[3]: If it contains multiple CA certificates, each one of them will be trusted, compare [1].

◧◩◪◨⬒
55. codeth+XC4[view] [source] [discussion] 2021-04-09 01:31:37
>>codeth+lc4
Correction: The MRENCLAVE value is the one on line 123, compare the signature of the `KbsEnclave` constructor -> https://github.com/signalapp/Signal-Android/blob/7394b4ac277...
◧◩◪◨⬒⬓⬔⧯▣▦▧
56. neolog+jH4[view] [source] [discussion] 2021-04-09 02:31:38
>>kreetx+kX3
The reality is pretty diverse. Plenty of people use mostly or only free software. I certainly do.
◧◩
57. frombo+KZ4[view] [source] [discussion] 2021-04-09 06:17:49
>>gojomo+xr
Exactly!

It isn't about cryptocurrency at all. It's about trust.

The relationship between mobilecoin and signal isn't even clear as far as I can tell. You're just asked to look at an obvious cash grab and be willing to accept it as normal operating procedure, and then expected to trust your most private communications to the same individuals that won't reveal their true intentions.

◧◩◪
58. lucide+uf7[view] [source] [discussion] 2021-04-09 20:46:03
>>woah+xU3
Yup. Security is hard.
◧◩◪◨⬒⬓⬔⧯▣
59. lucide+vg7[view] [source] [discussion] 2021-04-09 20:50:53
>>kreetx+zV3
I've tried to approach this thread in good faith, as your earlier replies seemed genuinely curious/discussion oriented, but the "ok, but" tone is making them seem increasingly shill-like.

I don't think anyone's "demanding" or "forcing" anything here. We're simply describing a definition of what we consider desirable as a sustainable secure messaging option, and pointing out the specific reasons that Signal isn't currently living up to that definition.

Its maintainers are free to continue on their way, ignoring said definition.

Personally, my own comments are not targeted at Signal devs but rather at others who might consider using Signal thinking it provides certain guarantees when it doesn't.

◧◩◪
60. dunefo+4H8[view] [source] [discussion] 2021-04-10 13:47:45
>>neolog+8X
Like Matrix?
◧◩
61. mr_woo+ZI8[view] [source] [discussion] 2021-04-10 14:08:22
>>emptys+gE
Thanks for linking this, I had no idea this occurred.