Unless you're going to hire some independent auditor (that you still have to trust) it seems logically problematic.
There's a counter-argument that there is still useful metadata a server can glean from its users, but it's certainly minimised with a good protocol... like the Signal protocol.
Auditors
Trusted Enclaves (but then you trust Intel)
Signed chain of I/O with full semantics specified (blockchain style).
Now please remove that cryptocoin stuff from the app. Don’t create another avenue for money laundering, tax evasion and drugs sales (if not worse).
If Signal /was/ federated it would be a strong hint that the server code stays the same.
And even if it's not the same, people would be able to run their own trusted servers.
https://github.com/signalapp/Signal-Server/commit/95f0ce1816...
As far as my understanding goes, it's hardly possible to verify even that a compiled binary is a faithful translation of the source instructions, let alone that it will execute that way when run through a modern OS and CPU pipeline.
I would think the objective here is more about releasing server code that can be run independently in a way that 1) doesn't involve signal's infrastructure and 2) allows the client/server interactions to be audited in a way that trust of the server side is unnecessary, regardless of what code it may or may not be running.
The recentish work to get read/write access to some Intel CPUs' microcode can probably break SGX too. I wouldn't be surprised if the ME code-execution flaws could be used that way as well.
The entirety of the signal "stack" depends on the SGX enclave. The fact that no one, in all time, has bothered to notice that the running code is different than the published code, is telling.
There's actually a newer SGX exploit, and related mitigation, that came to light at about the same time when they released their discovery protocol. Those mitigations were never backported to the base signal functionality. That no one audited and complained about this says quite a lot.
I've not looked at this code dump but perhaps the newer fixes finally made their way in. Or have been there all along.
It was kept under wraps for a grade A pump.
For normal development, I am advocating an always-auditable runtime that runs only public source code by design: https://observablehq.com/@endpointservices/serverless-cells
Before sending data to a URL, you can look up the source code first, as the URL encodes the source location.
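To make that concrete, here is a tiny sketch of the lookup idea. The runtime host and the path-to-notebook mapping below are hypothetical placeholders for illustration, not the actual serverless-cells URL scheme:

    import java.net.URI;

    public class SourceLookup {
        public static void main(String[] args) {
            // Hypothetical endpoint URL: the path mirrors the notebook location and
            // a ";cell" suffix names the exported cell (placeholder scheme, not the real one).
            URI endpoint = URI.create(
                "https://runtime.example/observablehq.com/@endpointservices/serverless-cells;hello");

            // Recover the notebook URL from the endpoint path, then review it before sending data.
            String path = endpoint.getPath();            // "/observablehq.com/@endpointservices/serverless-cells;hello"
            String withoutCell = path.split(";", 2)[0];  // drop the cell name
            String notebookUrl = "https://" + withoutCell.substring(1);

            System.out.println("Source to audit first: " + notebookUrl);
            // -> https://observablehq.com/@endpointservices/serverless-cells
        }
    }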
There is always the risk I decided to embed a trojan in the runtime (despite it being open source). However, if I am a service provider for 100k customers built upon the idea of a transparent cloud, then compromising the trust of one customer would cause loss of business across all customers. Thus, from a game-theoretic perspective, our incentives should align.
I think running public source code, which does not preclude injecting secrets and keeping data private, is something that normal development teams can do. No PhDs necessary, just normal development.
Follow me on https://twitter.com/tomlarkworthy if you want to see this different way of approaching privacy: always auditable source available server-side implementations. You can trust services implemented this way are safe, because you can always see how they process data. Even if you cannot be bothered to audit their source, the sheer fact that someone can, inoculates you against bad faith implementations.
I am building a transparent cloud. Everything is encoded in public notebooks and runs open-source https://observablehq.com/collection/@endpointservices/servic... There are other benefits, like being able to fork my implementations and customize, but primarily I am doing this for trust through transparency reasons.
People in the user forum (https://community.signalusers.org/t/where-is-new-signal-serv...) and in other places on the internet were upset for months because the server wasn't being updated anymore. At the same time, Signal regularly tweeted that "all they do is 100% open source", even at a point when no source code had been released for almost a year.
Just 2 days ago this was getting picked up by some larger tech news platforms:
https://www.golem.de/news/crypto-messenger-signal-server-nic...
https://www.androidpolice.com/2021/04/06/it-looks-like-signa...
It's normal that Signal ignores its users, but apparently they didn't even reply to press inquiries about the source code. All it would have taken is a clear statement like "we're working on a cool new feature and will release the sources once that's ready, please bear with us". Instead, they left people speculating for months.
This communication strategy, combined with the cryptocurrency announcement, may cause serious harm to Signal's reputation.
It’s client apps who verify (via attestation) that the code inside an SGX enclave is what they expect it to be, and clients are open source.
> The entirety of the signal "stack" depends on the SGX enclave
Only private contact discovery depends on trusting SGX.
Note the endpoint does a DYNAMIC lookup of source code. So you can kinda reassure yourself the endpoint is executing dynamic code just by providing your own source code.
It might be more obvious that the runtime does nothing much if you look at it: https://github.com/endpointservices/serverlesscells
The clever bits that actually implement services are all in the notebooks.
If the attestation signature corresponds to the published enclave code, then we can verify there's a match. So either there's a missing mitigation, which no one has ever complained about, or the running enclave code doesn't match the source, which also no one has ever complained about. Without an independent audit there is no verification, and we have established that independent parties do not care.
> Only private contact discovery depends on trusting SGX.
uh, no. this is demonstrably and obviously wrong.
It was probably apparent to them that adding the new crypto payments feature would create at least some kind of community pushback.
Waiting until the feature is reasonably complete and can be judged on its merits is good from a business perspective.
If I was evil, I wouldn't have a totally separate source tree and binary that I shipped; I'd have my CI process inject a patch file. As a result, everything would work as expected - including getting any changes from the public source code - but the created binaries would be backdoored.
Not officially, but see https://news.ycombinator.com/item?id=26725117. They stopped publishing code when they started on the cryptocurrency integration.
It’s quite clear that this crypto integration provides a perverse incentive for the project that points in the opposite direction of security.
> Signal had to verify that MobileCoin worked before exposing their users to the technology. That process took a long time because MobileCoin has lots of complicated moving parts.
> With respect to price, no one truly understands the market. It’s impossible to predict future price.
- https://twitter.com/mobilecoin/status/1379830618876338179
Reeks of utter BS. As the reply on this tweet says, features can be developed while being kept switched off with a flag.
cc: @dang
[0] https://news.ycombinator.com/item?id=26345937
[1] The title is the only thing worth reading in this pile of speculation and hand waving.
But maybe you don't want everyone to know about all the features / announcements months in advance?
"the bulk of users is either too stupid or unwilling to invest even the tiniest amount of effort into their privacy."
I don't feel the need to pull my punches.
This is the most deluded, idiotic response I've seen on hacker news in a long time.
It seems unlikely that the average person (or even a non-techie person of above-average intelligence, e.g. a doctor) will be able to set up Matrix in a way that is more secure than just installing Signal. Your security relies not just on you but on the weakest node in your network. Getting good security might require trade-offs, and your all-or-nothing mindset will not achieve it. The saying "Perfect is the enemy of done" comes to mind. Perfect security (or what you propose) is not one of the options in a secure system that has to exist in the real world.
Please remove your head from its dark, cavernous home.
Love, Me
It's been damaging to their claims of transparency for almost a year now; if anything, this should be the first step in repairing that damage. How is dumping a year's worth of private work into your public repo somehow doing damage to their trustworthiness?
> The first payments protocol we’ve added support for is a privacy focused payments network called MobileCoin, which has its own currency, MOB.
(Emphasis mine.)
[0] https://signal.org/blog/help-us-test-payments-in-signal/
Prior to seeing this post, I was already concerned that adding a crypto/payments integration would damage the Signal project, and this appears to be an immediate example of the kind of harms/perverse incentives I was concerned about.
(A counterargument to my theory here would perhaps be "Signal was always doing stuff like declining to publish their server code even prior to the payments integration", I'm not familiar enough with the history of the project to know the details there.)
In before "but it's not free as in no cost". That's why big corps will always fuck over the normies. As it stands, one cannot use the internet without either giving away their privacy or learning a lot about computers and how they work and how to use them.
The majority chose the "I don't care, give me the shiny app" route, and they fucked us all over by doing so. There's no right to easy, privacy-friendly computing. There's only the harsh reality that behind friendly blue and rainbow-colored companies sit people who will sell a digital recreation of yourself to anyone who cares to pay, and give you a few gigs of free e-mail space and a shiny app for it.
They've been obscuring their code for about a year and even then, it's not like Signal has always come out and said "we love the passion our fellow developers have for our commitment to privacy and security". They just let people sell their relatives on that promise and waited until they had a massive userbase to start monetizing their platform.
Thanks for your reply, I just wonder where all this trustworthiness has been coming from for the last 12 months while they've been quietly working on the platform without publishing any changes. It feels like a beta tester for a game being mad that there were lootboxes in the full release of the game when they weren't in the beta. Even if you didn't know they were coming, you had to assume something like it was inevitable given enough traction.
E2E encryption only helps you verify WHO you are connecting to, not what they are doing with your connection once it is established.
It's especially good to have this information while said coin is already publicly traded on some (albeit obscure) exchanges, because that allows you, as an insider, to slowly accumulate a nicely sized position in that coin while it's still cheap. "MobileCoin" wasn't publicly traded until December of last year; there wasn't even a live blockchain until that month. So if it's true that the first commits in the Signal server repo hinting at the crypto payment plans were created shortly after the repo went silent in April 2020, then keeping those commits secret until at least December 2020 was crucial to successfully realizing these good personal business prospects, a.k.a. insider trading, but nobody cares about it if it's crypto.
The problem is more generally called trusted computing, with Intel SGX being an implementation (albeit one with a pretty bad track record).
- Whether or not Signal's server is open source has nothing to do with security. Signal's security rests on the user's knowledge that the open source client is encrypting messages end to end. With that knowledge, the server code could be anything, and Signal inc. would still not be able to read your messages. In fact, having the server code open source adds absolutely nothing to this security model, because no matter how open source and secure the server code might be, Signal inc. could still be logging messages upstream of it. The security rests only upon the open source client code. The server is completely orthogonal to security.
- Signal's decision to keep early development of the MobileCoin feature set private was valid. Signal is not your weekend node.js module with two stars on Github. When changes get made to the repo, they will be noticed. This might mess up their marketing plan, especially if they weren't even sure whether they were going to end up going live with the feature. Signal is playing in the big leagues, competing with messengers which have billions of dollars in marketing budget, will never ever be even the smallest amount open source, and are selling all your messages to the highest bidder. They can't afford to handicap themselves just to keep some guys on Hacker News happy.
- Signal's decision to keep development to the (private) master branch, instead of splitting the MobileCoin integration into a long-running feature branch is a valid choice. It's a lot of work to keep a feature branch up to date over years, and to split every feature up into the public and non-public components which then get committed to separate branches. This would greatly affect their architecture and slow down shipping for no benefit, given that the open sourceness of the server is orthogonal to security.
Anyway, Signal is designed to handle all the private bits at the client side with e2ee so you have to put as little trust in the server as possible.
The issue a lot of people have with Signal is that your definition here of where security comes from is an extremely narrow & technical one, and many would rather look at security in a more holistic manner.
The problem with messaging security is that there's two ends, and individually we only control one of them. Ok, screenshotting & leaking your messages will always be a concern no matter what technology we develop, but the other challenge is just getting the other end to use Signal in the first place and that's governed by the network effect of competitors.
Open Source is essential for security because one of the most fundamental security features we can possibly hope to gain is platform mobility. Signal doesn't offer any. If Signal gains mass adoption and the server changes, we're right back to our current security challenge: getting your contacts onto the new secure thing.
But the standard Free Software development/distribution model does lack in some areas. And so Signal got a bunch of community leeway for going against the grain, in the hopes that a fresh approach would somehow bear fruit.
We're now apparently seeing some of the fruit from that approach.
Signal is not actually designed with mobility in mind (in fact I would argue, based on Moxie's 36C3 talks, it was designed to be and continues to be persistently kept anti-mobility). That fact is independent of it being open- or closed-source.
However, if the server is open-source, it opens the door for future mobility in the event of org change. If it's closed-source, you get what's currently happening with WhatsApp.
In actuality, if we had something federated, with mobility pre-baked in, having a closed-source server would be less of a security-risk (the gp's comments on only needing to trust the client would apply more strongly since mobility removes the power to change from server maintainers)
Basically:
- with multi-server clients (e.g. Matrix/OMEMO), you have no dependency on any orgs' server, so their being open-source is less relevant (provided the protocol remains open—this can still go wrong, e.g. with GChat/FBMessenger's use of XMPP).
- with single-server clients (Telegram/WhatsApp/Signal), you are dependent on a single server, so that server being open-source is important to ensure the community can make changes in the event of org change.
But users also have legitimate reasons to want more transparency into both source-code & strategy.
Whether such secrecy best serves the users & the cause of private messaging is an open question.
This is true only if you are exclusively concerned about your messages' content and not about the metadata. As we all know, though, the metadata is the valuable stuff.
There is a second reason it is wrong, though: These days, lots of actual user data (i.e. != metadata) gets uploaded to the Signal servers[0] and encrypted with the user's Signal PIN (modulo some key derivation function). Unfortunately, many users choose an insecure PIN, not a passphrase with lots of entropy, so the derived encryption key isn't particularly strong. (IMO it doesn't help that it's called a PIN. They should rather call it "ultra-secure master passphrase".) This is where a technology called Intel SGX comes into play: It provides remote attestation that the code running on the servers is the real deal, i.e. the trusted and verified code, and not the code with the added backdoor. So yes, the server code does need to be published and verified.
Finally, let's not forget the fact that SGX doesn't seem particularly secure, either[1], so it's even more important that the Signal developers be open about the server code.
[0]: https://signal.org/blog/secure-value-recovery/
[1]: https://blog.cryptographyengineering.com/2020/07/10/a-few-th...
Nope. It's a reaction to "who the f* asked for this in a messaging app?!".
For Android at least, builds are reproducible: https://signal.org/blog/reproducible-android/ (it would be neat if there were one or more third-party CIs that also checked that the CI-built app reproduces the one on the Google Play Store – or maybe there already are?)
Unfortunately, `rg -i SGX` only yielded the following two pieces of code:
https://github.com/signalapp/Signal-Android/blob/master/libs...
https://github.com/signalapp/Signal-Android/blob/master/libs...
No immediate sign of a fixed hash. Instead, it looks like the code only verifies the certificate chain of some signature? How does this help if we want to verify the server is running a specific version of the code and we cannot trust the certificate issuer (whether it's Intel or Signal)?
I'm probably (hopefully) wrong here, so maybe someone else who's more familiar with the code could chime in here and explain this to me? :)
While it's nice to try to have Signal be resilient to attacks by the core team, there just aren't enough community-minded, independent volunteer code reviewers to reliably catch them out. I doubt the Signal Foundation gets any significant volunteer effort, even from programmers who aren't security experts.
That means I need to decide if I trust the Signal Foundation. Shilling sketchy cryptocurrencies is indicative of loose morals, which makes me think I was wrong to trust them in the past.
For notifications the alternatives are noticeably worse (higher battery usage because you can't coordinate request timings with other apps, an annoying permanent notification), and the leakage is minimal. If you protect your encrypted packets from Google, the NSA will see them anyway.
Your custom implementation will be quite complicated, and if you only enable it for a small subset of your users it'll be a pain to debug.
These are "valid" reasons for keeping the source code private for a year? By whose book? Yours? Certainly not by mine. I wouldn't let any other business renege on its promise to keep open source open source in spirit and practice, so why would I let Signal?
This is some underhanded, sneaky maneuvering I'm more used to seeing from the Amazons and the Facebooks of the world. These are not the actions of an ethically Good organization. And as Moxie has already demonstrated in his lust for power, he's more than capable of devious behavior. On Wire vs Signal: "He claimed that we had copied his work and demanded that we either recreate it without looking at his code, or take a license from him and add his copyright header to our code. We explained that we have not copied his work. His behavior was concerning and went beyond a reasonable business exchange — he claimed to have recorded a phone call with me without my knowledge or consent, and he threatened to go public with information about alleged vulnerabilities in Wire’s implementation that he refused to identify." [1]
These are not the machinations of the crypto-idealist, scrappy underdog for justice we are painted by such publications as the New Yorker. This is some straight-up, moustache-twirling cartoon-villain plotting.
So now I'm being sold on a business vision that was just so hot the public's eyes couldn't bear it? We're talking about a pre-mined cryptocurrency whose inventors are laughing all the way to the bank.
At least Pavel Durov of Telegram is honest with his users. At least we have Element doing their work in the open for all to see with the Matrix protocol. There are better, more ethical, less shady organizations out there who we can and ought to be putting our trust in, not this freakshow of a morally-compromised shamble.
[1] https://medium.com/@wireapp/axolotl-and-proteus-788519b186a7
> "the bulk of users is either too stupid or unwilling to invest even the tiniest amount of effort into their privacy."
Based on my interactions with users writ large, this assessment is on the money. Normies do not care and will never care.
If I understand what you are saying and what Signal says, Signal anticipates this problem and provides a solution that is arguably optimal:
https://signal.org/blog/secure-value-recovery/
My (limited) understanding is that the master key consists of the user PIN plus c2, a 256-bit value generated by a secure RNG, and that the Signal client uses a key derivation function to maximize the master key's entropy. c2 is stored in SGX on Signal's servers. If the user PIN is sufficiently secure, c2's security won't matter: an attacker with c2 still can't bypass the PIN. If the PIN is not sufficiently secure, as often happens, c2 stored in SGX might be the most secure way to augment it while still keeping the data recoverable.
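A rough sketch of how I read that construction (not Signal's exact code; the blog post describes a memory-hard KDF, and PBKDF2 stands in here only because it ships with the JDK): stretch the low-entropy PIN, then mix in the 256-bit c2 so that neither piece alone yields the master key.

    import javax.crypto.Mac;
    import javax.crypto.SecretKeyFactory;
    import javax.crypto.spec.PBEKeySpec;
    import javax.crypto.spec.SecretKeySpec;
    import java.security.SecureRandom;

    public class MasterKeySketch {
        public static void main(String[] args) throws Exception {
            String pin = "1234";                       // low-entropy user input
            byte[] c2 = new byte[32];                  // 256-bit secret held server-side in SGX
            new SecureRandom().nextBytes(c2);

            // 1) Stretch the PIN (PBKDF2 as a stand-in for the memory-hard KDF).
            byte[] salt = "example-salt".getBytes();   // illustrative only
            PBEKeySpec spec = new PBEKeySpec(pin.toCharArray(), salt, 100_000, 256);
            byte[] stretchedPin = SecretKeyFactory
                    .getInstance("PBKDF2WithHmacSHA256")
                    .generateSecret(spec)
                    .getEncoded();

            // 2) Mix in c2 so that neither the PIN nor c2 alone yields the master key.
            Mac hmac = Mac.getInstance("HmacSHA256");
            hmac.init(new SecretKeySpec(stretchedPin, "HmacSHA256"));
            byte[] masterKey = hmac.doFinal(c2);

            System.out.println("master key bytes: " + masterKey.length);
        }
    }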
I'd love to hear from a security specialist regarding this scheme. I'm not one and I had only limited time to study the link above.
I agree about the sorry state of non-Google notifications on Android. I wish someone would make a common notification framework for the Free world that could be installed alongside system-level F-Droid. Although F-Droid Conversations and Element notifications do work fine for me despite the purportedly higher battery usage, I can understand that not everyone wants to make the same choice.
However, I'm referencing more than the notifications issue. I recall an early thread from Signal where they touted the benefits of fully opting into the Google ecosystem - the gist was that Google has expended all of this effort on security and they wanted to take advantage of it to bring security to the masses. And that simply doesn't line up with my own threat model, in which Google is one of the most significant attackers.
Yep, this is what I meant when I said "This is where a technology called Intel SGX comes into play". :)
And you're right, SGX is better than nothing if you accept that people use insecure PINs. My argument mainly was that
- the UI is designed in the worst possible way and actually encourages people to choose a short insecure PIN instead of recommending a longer one. This means that security guarantees suddenly rest entirely on SGX.
- SGX requires the server code to be verified and published (which it wasn't until yesterday). Without verification, it's all pointless.
> uses a key derivation function to maximize the master key's entropy
Nitpick: Technically, the KDF is deterministic, so it cannot change the entropy and – as the article says – you could still brute-force short PINs (if it weren't for SGX).
> I'd love to hear from a security specialist regarding this scheme. I'm not one and I had only limited time to study the link above.
Have a look at link [1] in my previous comment. :)
You make it sound as if that's a bad thing. Reflexes are beneficial when the threats are real. And the crypto“currency” multi-level-marketing pyramid schemes are very real. They do nothing but induce greed, gambling, spam, and all-around toxic behavior. It's a digital cancer that needs to end.
Who decided it was sketchy?
The "I don't like change so I'm going to piss all over you" attitude is what sinks a lot good things.
How does Signal benefit from being a shill for this coin? Are they being paid by MOB or do they get a % of the cut?
So far all I've read are people screaming their heads off that MOB eats babies and how dare Signal stoop so low as to even fart in their general direction, but I have yet to see anyone explain why MOB is bad or how Signal is bad for giving MOB a platform.
When people evaluate a new messaging client, the minimum feature set required to be considered viable now includes sending money for a lot of the population.
* removed insult
The CEO of Signal Messenger LLC was/is the CTO of MOB.
See https://www.reddit.com/r/signal/comments/mm6nad/bought_mobil... and https://www.wired.com/story/signal-mobilecoin-payments-messa...
The contrarian dynamic strikes again: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor...
But, a good-faith operator can find and fix bugs faster if they operate in the open and in collaboration with the community. "Given enough eyeballs, all bugs are shallow" etc.
Btw, the Signal Foundation is a non-profit organization that benefits from community goodwill based on an open-source ethos. So people are critical when its software is closed source.
By incorporating cryptocurrency/payments, governments are being handed a massive lever to force Signal to comply with the financial monitoring requirements that governments have in place.
This has a negative impact on those of us who just wanted a secure communications platform.
What exactly do they rely on Google for? They use them for their push notifications and they use some Google servers on the back end.
They do offer the app in the app store, as 99% of Android users get their apps that way, but Signal also offers app downloads from the Signal website if the user doesn't want to use the Play Store.
Wrong. Availability is a core component of security. Keeping your server implementation closed and only available from one single entity gives signal a failing grade in my book.
Signal is great for one-off messaging but is a poor long-term solution for secure instant messaging.
Certain people on their team don't like the PGP standard despite the fact that it is mature, standardized, and proven to work well for code signing. When questioned about their reasoning, they'd usually deflect and criticize some aspect of PGP that is irrelevant to code signing at all.
In their minds, they believe it is better to rely on git's broken SHA1 fingerprints than to use PGP.
They don't owe me anything but I think it's a shame that the leading open source messenger app does such a poor job of communicating with its users and the larger open source community.
So if major players jump off a cliff, everyone should always follow?
> the minimum feature set required to be considered viable now includes sending money for a lot of the population.
? No, thank you. Not where the actual banking system is working.
You are free to examine the source of theirs (if they choose to continue releasing it), but you cannot self-host.
So you would have to then follow the above steps for any contacts you want to communicate with, distributing your own client to them. Signal devs have generally been extremely hostile toward anyone wishing to do this however.
The only way out of this situation would be if the Signal project itself was forked and people moved to that forked open-source multi-server client.
But in Edit 4 of the Reddit post it says
> Also, the extended whitepaper wrongly cites Moxie as chief technology officer, while he is the technical advisor.
On top of that, this whitepaper that's being passed around is a forgery with 1.5 pages of factually incorrect information.
The Wired article you link isn't even really critical; it just matter-of-factly explains the feature and its backstory.
> Signal's choice of MobileCoin is no surprise for anyone watching the cryptocurrency's development since it launched in late 2017. Marlinspike has served as a paid technical adviser for the project since its inception, and he's worked with Goldbard to design MobileCoin's mechanics with a possible future integration into apps like Signal in mind. (Marlinspike notes, however, that neither he nor Signal own any MobileCoins.)
It seems like you're drawing conclusions based on lies you read second- or third-hand and didn't bother to verify. The reddit post you linked to would be received very differently depending on when you read it, since it's been edited numerous times with addendums refuting earlier claims. Only by reading from beginning to end, with all edits, can you start to see a clear picture. Even then, the picture I see is someone backpedaling on a lot of false claims they made.
Everyone else is trying to pass off false information, doctored white papers, and all sort of conspiracy theories to support their dislike for the feature.
There are some links there to other pieces if you want to read more about it.
> for sure doesn't count as basis for what you are entitled for
I'm not claiming that moral authority flows from the Gnu brand; rather, they provide some information and reasoning which people can use to come to their own conclusions.
If you don't want to be banned, you're welcome to email hn@ycombinator.com and give us reason to believe that you'll follow the rules in the future. They're here: https://news.ycombinator.com/newsguidelines.html.
We detached this subthread from https://news.ycombinator.com/item?id=26727160.
SGX running on centralized servers turns that calculus on its head by concentrating the benefits of the hack all in one place.
During remote attestation, the prover (here, Signal's server) creates a "quote" that proves it is running a genuine enclave. The quote also includes the MRENCLAVE value.
It sends the quote to the verifier (here, Signal-Android), which in turn sends it to the Intel Attestation Service (IAS). IAS verifies the quote, then signs the content of the quote, thus signing the MRENCLAVE value. The digital signature is sent back to the verifier.
Assuming that the verifier trusts IAS's public key (e.g., through a certificate), it can verify the digital signature, thus trust the MRENCLAVE value is valid.
The code where the verifier is verifying the IAS signature is here: https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...
The code where the MRENCLAVE value is checked is here: https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...
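In plain Java, the client-side checks boil down to roughly the following. This is a simplified sketch; the real code linked above also validates the IAS certificate chain, quote status, timestamps, etc., and the method and parameter names here are illustrative, not Signal's API:

    import java.security.MessageDigest;
    import java.security.PublicKey;
    import java.security.Signature;

    public class AttestationCheckSketch {

        // Simplified shape of the client-side attestation check.
        static boolean quoteIsTrustworthy(byte[] quoteBody,
                                          byte[] iasSignature,
                                          PublicKey iasPublicKey,
                                          byte[] expectedMrenclave,  // baked into the client build config
                                          byte[] quoteMrenclave)     // parsed out of the quote body
                throws Exception {

            // 1) Did IAS really vouch for this quote? Verify its signature over the quote body.
            Signature sig = Signature.getInstance("SHA256withRSA");
            sig.initVerify(iasPublicKey);
            sig.update(quoteBody);
            if (!sig.verify(iasSignature)) {
                return false;
            }

            // 2) Is it the enclave we expect? Compare MRENCLAVE values in constant time.
            return MessageDigest.isEqual(expectedMrenclave, quoteMrenclave);
        }
    }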
Hope this helps!
What I mean is: one can agree that Signal is not Element.io/Matrix and that the latter is better for freedom and openness. But the demand from people that Signal somehow owes them the ability to be like Matrix, be federated, etc., combined with how judgemental they are about it, is what rubs me the wrong way.
It's OK to think that in an ideal world it would be like that, but arguing as if you were entitled to the source because of it doesn't seem likely to persuade others. After all, if you aren't empathetic to the reality, how would you expect others to be empathetic to you?
Let's say we have a Signal-Android client C, and the Signal developers are running two Signal servers A and a B.
Suppose server A is running a publicly verified version of Signal-Server inside an SGX enclave, i.e. the source code is available on GitHub and has been audited, and server B is a rogue server, running a version of Signal-Server that comes with a backdoor. Server B is not running inside an SGX enclave but since it was set up by the Signal developers (or they were forced to do so) it does have the Signal TLS certificates needed to impersonate a legitimate Signal server (leaving aside SGX for a second). To simplify things, let's assume both servers' IPs are hard-coded in the Signal app and the client simply picks one at random.
Now suppose C connects to B to store its c2 value[0] and expects the server to return a remote attestation signature along with the response. What is stopping server B then from forwarding the client's request to A (in its original, encrypted and signed form), taking A's response (including the remote attestation signature) and sending it back to C? That way, server B could get its hands on the crucial secret value c2 and, as a consequence, later on brute-force the client's Signal PIN, without C ever noticing that B is not running the verified version of Signal-Server.
What am I missing here?
Obviously, Signal's cloud infrastructure is much more complicated than that, see [0], so the above example has to be adapted accordingly. In particular, according to the blog post, clients do remote attestation with certain "frontend servers" and behind the frontend servers there are a number of Raft nodes and they all do remote attestation with one another. So the real-life scenario would be a bit more complicated but I wanted to keep it simple. The point, in any case, is this: Since the Signal developers are in possession of all relevant TLS certificates and are also in control of the infrastructure, they can always MITM any of their legitimate endpoints (where the incoming TLS requests from clients get decrypted) and put a rogue server in between.
One possible way out might be to generate the TLS keys inside the SGX enclave, extract the public key through some public interface while keeping the private key in the encrypted RAM. This way, the public key can still be baked into the client apps but the private key cannot be used for attacks like the one above. However, for this the clients would once again need to know the code running on the servers and do remote attestation, which brings us back to my previous question – where in Signal-Android is that hash of the server code[1]?
[0]: https://signal.org/blog/secure-value-recovery/
[1]: More precisely, the code of the frontend enclave, since the blog post[0] states that it's the frontend servers that clients do the TLS handshake with:
> We also wanted to offload the client handshake and request validation process to stateless frontend enclaves that are designed to be disposable.
[0]: https://sgx101.gitbook.io/sgx101/sgx-bootstrap/attestation#r...
[1]: For other people interested in this matter: I've followed the path of the MRENCLAVE variable to the very end,
https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...
where it gets injected by the build config. The build config, in turn, is available here:
https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...
(The MRENCLAVE values can be found around line 120.)
I doubt that many people rebuild the app at each update to check that the new binaries match the ones provided by their store. If, for example, the PlayStore distributed at large a binary that doesn't match the published sources, some dedicated user would probably spot the issue.
However, with the PlayStore (and Signal, though it's not even necessary for the following) being under US jurisdiction, any user not checking each update they receive is vulnerable to the famous NSL-plus-gag-order combo in case of a targeted attack. I recognize that this is probably something most people do not include in their threat model, but I'm still a bit dubious that the convenience of release management and not having to worry about interoperability is worth accepting the risks of a single delivery channel, especially for what could be (and is widely thought to be) a completely secure IM solution. "Almost secure" is, frighteningly, the worst obstacle to "secure"...
I'm admittedly biased since I'm convinced that federation, multiple client/server implementations and multiple distribution channels are a requirement for a secure IM infrastructure (which is why my heart goes to Matrix nowadays).
https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...
For completeness, let's also have a look at where the CA certificate(s) come(s) from. The PKIX parameters[1] are retrieved from the trustStore aka iasKeyStore which, as we follow the rabbit hole back up, gets instantiated here:
https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...
As we can see, the input data comes from
https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...
`R.raw.ias` in line 20 refers to the file `app/src/main/res/raw/ias.store` in the repository, and as we can see from line 25 it's encrypted with the password "whisper" (which seems weird, but it looks like this is a requirement[2] of the API). I don't have time to look at the file right now, but it will probably (hopefully) contain only[3] Intel's CA certificate and not an even broader one. At least this is somewhat suggested by the link I posted earlier:
https://github.com/signalapp/Signal-Android/blob/master/libs...
In any case, it seems clear that the IAS certificate itself doesn't get pinned. Not that it really matters at this point: whether the certificate gets pinned or not, an attacker only needs access to the IAS server anyway to steal its private key. Then again, trusting a CA (and thus any certificate derived from it) obviously widens the attack surface. OTOH it might be that Intel is running a large array of IAS servers that come and go, with no guarantee on Intel's part that a pinned certificate will still be valid tomorrow. In this case, the Signal developers obviously can't do anything about that.
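For anyone following along, here is roughly what that keystore-plus-PKIX dance boils down to in plain Java. This is a sketch, not the actual Signal-Android code; the file name and the "whisper" password are just the details discussed above, and the keystore type is a guess (the real file may well be a BKS keystore):

    import java.io.FileInputStream;
    import java.security.KeyStore;
    import java.security.cert.CertPath;
    import java.security.cert.CertPathValidator;
    import java.security.cert.CertificateFactory;
    import java.security.cert.PKIXParameters;
    import java.util.List;

    public class IasTrustCheckSketch {
        public static void main(String[] args) throws Exception {
            // Load the bundled trust store (standing in for app/src/main/res/raw/ias.store).
            KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
            try (FileInputStream in = new FileInputStream("ias.store")) {
                trustStore.load(in, "whisper".toCharArray());
            }

            // Every CA certificate in the store becomes a trust anchor, which is
            // why it matters how many CAs the file contains.
            PKIXParameters pkixParams = new PKIXParameters(trustStore);
            pkixParams.setRevocationEnabled(false);

            // Validate the certificate chain presented alongside the IAS signature
            // ("ias-signing-cert.pem" is a placeholder for that chain).
            CertificateFactory cf = CertificateFactory.getInstance("X.509");
            CertPath chain = cf.generateCertPath(
                    List.of(cf.generateCertificate(new FileInputStream("ias-signing-cert.pem"))));
            CertPathValidator.getInstance("PKIX").validate(chain, pkixParams);
            System.out.println("IAS signing certificate chains to a trusted anchor.");
        }
    }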
[0]: https://twitter.com/matthew_d_green/status/13802817973139742...
[1]: https://docs.oracle.com/javase/8/docs/api/index.html?java/se...
[2]: https://stackoverflow.com/questions/4065379/how-to-create-a-...
[3]: If it contains multiple CA certificates, each one of them will be trusted, compare [1].
It isn't about cryptocurrency at all. It's about trust.
The relationship between mobilecoin and signal isn't even clear as far as I can tell. You're just asked to look at an obvious cash grab and be willing to accept it as normal operating procedure, and then expected to trust your most private communications to the same individuals that won't reveal their true intentions.
I don't think anyone's "demanding" or "forcing" anything here. We're simply describing a definition of what we consider desirable as a sustainable secure messaging option, and pointing out the specific reasons that Signal isn't currently living up to that definition.
Its maintainers are free to continue on their way, ignoring said definition.
Personally, my own comments are not targeted at Signal devs but rather at others who might consider using Signal thinking it provides certain guarantees when it doesn't.