The only reason this didn't turn into a disaster was pure luck.
I think F-Droid were acting in the right up to that point; but then the latest update (the regex newline issue) is a 0-day? Has there been a response from F-Droid about the updates?
Is it? Or is it a case of "It rather involved being on the other side of this airtight hatchway"[1]? The APK signature validation done by fdroidserver seems totally superfluous. Android is already going to verify the certificate if you try to update an app, and presumably whatever upload mechanism exists is already authenticated some other way (e.g. an API token or username/password), so it's unclear what the signature validation adds, aside from maybe preventing installation failures.
[1] https://devblogs.microsoft.com/oldnewthing/20060508-22/?p=31...
But yeah other repos don't :(
I often wonder how secure these open source projects actually are. I'm curious about using Waydroid in SteamOS, but it looks like it only runs LineageOS (apparently a derivative of CyanogenMod).
I know that people claim that open source is more secure because anyone can audit it, but I wonder how closely its security is actually interrogated. Seems like it could be a massive instance of the bystander effect.
All of it gives me a bias towards using official sources from companies like Apple and Google, who presumably hire the talent and institute the processes to do things right. And in any case, having years/decades of popularity is its own form of security. You know anyone who cares has already taken shots at Android and iOS, and they're still standing.
LineageOS is popular in this field because in essence it's a derivative of AOSP (the Android project as shipped by Google) with modest modifications to support a crapload of devices, instead of the handful that AOSP supports. This makes it easier to build and easier to support new platforms.
The bulk of the security in AOSP (and thus, LineageOS) comes from all the mitigations that are already built into the system by Google, and the bulk of the core system that goes unmodified. The biggest issue is usually the kernel, which may go unpatched when the manufacturer abandons it (just like the rest of the manufacturer's ROM), and porting all the kernel modifications to newer versions is often incredibly tricky.
Will it if it's a non-Google distro of Android?
The answer is that, no, nobody akshuarry audits anything. This has been proven time and time again, especially in the last few years.
> All of it gives me a bias towards using official sources from companies like Apple and Google, who presumably hire the talent and institute the processes to do things right.
What you get from commercial vendors is liability: you get to demand they take responsibility because you paid them cold hard cash. Free products have no such guarantees; you are your own liability.
Android is extremely complex, so I think many of the custom ROMs likely have some rookie security mistakes and quite a few security bugs due to the mishmash of drivers. Android is still better than most Linux distros due to its architecture, though. The default setup of many distros has little isolation, if any.
The use of AllowedAPKSigningKeys, afaik, is to compare that key with the key used for signing the dev build. If it's not the same, the dev build is rejected.
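For illustration, here's a rough Python sketch of what that kind of allowlist check could look like, shelling out to the Android SDK's apksigner. The allowlist digest is a made-up placeholder, and this is not fdroidserver's actual implementation:

    # Hypothetical sketch of an AllowedAPKSigningKeys-style check.
    # Assumes the Android SDK's `apksigner` is on PATH; the allowlist
    # entry below is a placeholder, not a real certificate digest.
    import re
    import subprocess

    ALLOWED_SIGNING_KEYS = {"<64-hex-char-sha256-of-dev-signing-cert>"}

    def signer_cert_digests(apk_path):
        # `apksigner verify --print-certs` prints one line per signer:
        #   Signer #1 certificate SHA-256 digest: <64 hex chars>
        # check=True raises if the APK's signature doesn't verify at all.
        out = subprocess.run(
            ["apksigner", "verify", "--print-certs", apk_path],
            capture_output=True, text=True, check=True,
        ).stdout
        return set(re.findall(r"certificate SHA-256 digest: ([0-9a-f]{64})", out))

    def is_allowed(apk_path):
        digests = signer_cert_digests(apk_path)
        # Reject if unsigned, or if any signer is outside the allowlist.
        return bool(digests) and digests <= ALLOWED_SIGNING_KEYS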
From what I've understood from this POC, it's possible to bypass this signature check. The only exploit I can think of with this bypass is that someone who gets access to the developer's release channel can host their own signed APK, which will either get rejected by Android in case of an update (signature mismatch) or get installed in case of a first install. But in either case, it's still the same reproducible build; only the signature is different.
https://www.opentech.fund/security-safety-audits/f-droid/
https://f-droid.org/2018/09/04/second-security-audit-results...
https://f-droid.org/2022/12/22/third-audit-results.html
I was involved in addressing the issues identified in the first one, in 2015. It was a great experience, much more thorough than the usual "run numerous static analysers and deliver a 100-page PDF full of false positives" that you often receive.
Edit: on second thought, they could pin certificate hashes like F-Droid does on the build server, but verify them client-side instead. If implemented correctly this could indeed work. However, I think F-Droid with reproducible builds is still a safer bet, as an attacker would have to get write access to the source repo as well and hide their malicious code so that F-Droid can still build and verify it.
If you try to update the app. Anyone installing the app from scratch will still be vulnerable. Effectively, both cases are Trust On First Use, but AllowedAPKSigningKeys moves the First Use boundary from "the first time you install the app" to "the first time F-Droid saw the app". Izzy wrote a blog post about it a while ago.[0]
> and presumably whatever upload mechanism is already authenticated some other way (eg. api token or username/password)
IzzyOnDroid (and, I believe, F-Droid) don't have their own upload UI or authentication, they poll the upstream repo periodically.
[0]: https://f-droid.org/2023/09/03/reproducible-builds-signing-k...
I would easily believe that many Android systems have vulnerabilities owing to the horrific mess that is their kernel situation. That said, I personally doubt that aftermarket ROMs are worse than stock, as official ROMs are also running hacked up kernels.
Sooo how about the audits linked in >>42592444 ?
> Instead of adopting the fixes we proposed, F-Droid wrote and merged their own patch [10], ignoring repeated warnings it had significant flaws (including an incorrect implementation of v1 signature verification and making it impossible to have APKs with rotated keys in a repository).
This concerns me more than the vulnerabilities themselves. It's a pretty serious failure in leadership and shows that F-Droid is still driven by egos, not sound software engineering practices and a genuine interest in doing right for the community.
F-Droid has numerous issues:
* glacially slow to release updates even when security patches are released
* not enforcing 2FA for developer accounts
* no automatic vulnerability or malware scanning
...and more problems: https://privsec.dev/posts/android/f-droid-security-issues/
Do you mean OEM drivers or the Android Kernel, specifically?
Google invests quite a bit in hardening the Android Common Kernel, including compile-time/link-time and runtime mitigations (both in hardware and software).
Ex: https://android-developers.googleblog.com/2018/10/control-fl...
1. User downloads an app from F-Droid that supports reproducible builds.
2. The developer's account is compromised, and the attacker submits an app signed with a different-than-expected signing key.
3. A new user installs the app (existing users aren't affected due to Android's enforcement of using the same signing key for updates).
4. This user is (external to the app) contacted by the attacker and directed to install an update to the app from them. The update contains malicious code.
F-Droid's response is concerning but this attack scenario seems pretty unlikely to work in practice.
So, IMO, we should not fall into the trap of immediately removing apps that had a security flaw and pushing users back to a far worse alternative (which sideloading is).
"Security researchers" IMO are the most cringe and worst examples of community members possible. They do not care about making things better, they only care about their own brand. Selling themselves, and climbing the ladder of embarrassed hard working people who do things for the love of doing.
Which key(s) is it signed with? What is the hash of the corresponding unsigned artifact?
Signature verification tools should have some option which prints these things in a machine-readable format.
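As a sketch of what I mean, for the simple case of a GPG detached signature (where the unsigned artifact is just the input file itself), a wrapper could emit both the key fingerprints and the artifact hash as JSON. This assumes gpg is on PATH and is only meant to illustrate the output shape:

    # Illustrative only: a machine-readable verification report for a
    # GPG detached signature. Assumes `gpg` is on PATH.
    import hashlib, json, subprocess, sys

    def verify_report(sig_path, artifact_path):
        # --status-fd 1 makes gpg emit stable status lines on stdout,
        # e.g. "[GNUPG:] VALIDSIG <fingerprint> ..."
        proc = subprocess.run(
            ["gpg", "--status-fd", "1", "--verify", sig_path, artifact_path],
            capture_output=True, text=True,
        )
        fingerprints = [
            line.split()[2]
            for line in proc.stdout.splitlines()
            if line.startswith("[GNUPG:] VALIDSIG")
        ]
        with open(artifact_path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        return json.dumps({
            "valid": proc.returncode == 0,
            "signing_key_fingerprints": fingerprints,
            "unsigned_artifact_sha256": digest,
        })

    if __name__ == "__main__":
        print(verify_report(sys.argv[1], sys.argv[2]))

For signature formats that embed into or modify the artifact (like APK v2/v3 signing), defining "the corresponding unsigned artifact" is exactly the underspecified part I mean.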
I did some work on reproducibility of Android apps and system images with Nix, and while defining a build step which can automatically establish these relationships sounds a bit goofy, it can make the issues with underspecified edge cases visible by defining verification more strictly. I did not do this to look for those edge cases though.
I am still working on that type of stuff now, but on more fundamental issues of trust we could start addressing with systems like Nix.
i still believe "pgp is too complex" was the most successful cia counter action after they lost the crypto wars to the people.
solving via nix only works within the flawed assumptions that end users either fully trust google or fdroid and are incapable of anything else.
Has been dead for 8+ years. LineageOS is its own thing by now.
> anyone who cares has already taken shots at Android and iOS
LineageOS is based on AOSP, plus some modifications that do not affect security negatively.
(I am just trying to push the visibility of your comment ;) )
PGP is too complex. I've known my way around the command line since before I learned how to hand-write, and I still have to look up the commands to fetch the keys and/or verify the blob every single time. Keyservers regularly fail to respond. There's no desktop integration to speak of. The entire UX stinks of XKCD 196.
Don't blame CIA for obvious deficiencies in usability.
Are you suggesting that ROMs provided through Android Studio's emulator are somehow not built by Google?
That still enables a supply chain attack, which should not be dismissed - virtually all modern targeted attacks involve some complex chain of exploits; a sufficiently motivated attacker will use this.
1. What you're describing would have to happen in the f-droid app, but the vulnerability seems to be in fdroidserver?
2. Even if this actually affected the f-droid app, what you described seems like a very modest increase in security. The attack this prevents (i.e. a compromised server serving a backdoored APK with a different signature) would also raise all kinds of alarms from people who already have the app installed, so practically such an attack would be discovered relatively quickly.
> IzzyOnDroid (and, I believe, F-Droid) don't have their own upload UI or authentication, they poll the upstream repo periodically.
Doesn't f-droid perform the build themselves and sign the APK using their own keys? They might be pulling from the upstream repo, but that's in source form, and before APKs are signed, so it's irrelevant.
> But in either case, it's still the same reproducible build; only the signature is different.
That means the attacker still has to compromise the source repo. If they don't and try to upload a backdoored APK, that would cause a mismatch with the reproducible build and be rejected. If you can compromise the source repo, you're already screwed regardless. APK signature checks can't protect you against that.
I'm not saying I have evidence that this happened to PGP specifically, just that it doesn't seem at all implausible. If the CIA told me my code was never to get too easy to use, but otherwise I could live a long and happy life and maybe get a couple of government contracts, it would be hard to argue.
The fact that a mass-market interface never took off (GPG and other descendants notwithstanding) may indicate that the whole cryptographic idea is inherently not amenable to user-friendliness, but I don't find that hypothesis as compelling.
(It could also be an unlikely coincidence that there's a good solution not found for lack of looking, but that's even less plausible to me.)
Thanks to the efforts of Google to "simplify" smartphones the average young person now couldn't find and double-click a downloaded file if their life depended on it.
In the US, a manual car is considered an anti-theft device. In Europe, basically everyone that isn't obscenely rich has driven a manual car at some point.
People learn what they're expected to learn.
signify is approachable at least for the power users - I could print out that man page on a T-shirt. HTTPS is ubiquitous and easy, thanks to ACME & Let's Encrypt. E2EE with optional identity verification is offered in mainstream chat apps.
And of course there are usability improvements to GPG, being made by third parties: Debian introduced package verification a couple decades ago, Github does commit verification, etc. What's to stop e.g. Nautilus or Dolphin from introducing similar features?
As contributors, we enjoy a lot of trust, as we should. That's why it's not a problem if we make seemingly random changes that don't necessarily make a lot of sense, but seem relevant to security, when they actually fix an issue in the code. After all, it's necessary to prevent bad guys from gaining sensitive information, and to keep your colleagues from being unduly bothered with challenges they could possibly help with.
I have no doubt that this is true, but I very much question whether any alternate UX would solve this problem for you, because the arguments for these two tasks are given very obvious names: `gpg --receive-keys <keyIDs>` and `gpg --verify <sigfile>`. There's no real way to make it easier than that, you just have to use it more.
The tool also accepts abbreviations of commands to make things easier, i.e. you could also just blindly type `gpg --receive <keyID>` and it would just work.
I think you are right that the UI sucks in many cases, but I think it's not intrinsic to PGP; it's fixable.
Like you said people learn what they're expected to learn.
It depends on the software. Something widely used and critical to people who are willing to put resources in is a lot more likely to be audited. Something that can be audited has got to be better than something that cannot be.
> All of it gives me a bias towards using official sources from companies like Apple and Google, who presumably hire the talent and institute the processes to do things right.
I am not entirely convinced about that, given the number of instances we have of well funded companies not doing it right.
> You know anyone who cares has already taken shots at Android and iOS, and they're still standing.
There has been quite a lot of mobile malware and security issues, and malicious apps in app stores. Being more locked down eliminates some things (e.g. phishing to install malware) but they are far from perfect.
However, WhatsApp/Signal show how E2E can be done in a user-compatible way. By default they simply exchange keys and show a warning when a key changes, and those who need/want to can verify identity.
What's missing there, of course, is openness.
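The mechanism behind that warning is basically trust-on-first-use. A minimal sketch of the idea (the storage path and key format are invented for illustration):

    # Minimal TOFU ("trust on first use") key pinning sketch.
    # PIN_FILE and the fingerprint format are made up for illustration.
    import json, os

    PIN_FILE = os.path.expanduser("~/.key_pins.json")

    def check_key(contact, fingerprint):
        pins = {}
        if os.path.exists(PIN_FILE):
            with open(PIN_FILE) as f:
                pins = json.load(f)
        if contact not in pins:
            pins[contact] = fingerprint  # first use: pin silently
            with open(PIN_FILE, "w") as f:
                json.dump(pins, f)
            return "pinned"
        if pins[contact] != fingerprint:
            return "WARNING: key changed"  # surface this to the user
        return "ok"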
I wonder why there aren't more, but there are some, for example Proton's efforts towards encrypted email.
https://proton.me/support/how-to-use-pgp
(I won't mention the relative shortcomings of HTTPS and E2E chat apps here.)
Which post are you talking about? >>42592150 was made by FuturisticGoo, not me.
Also, the wording on f-droid suggests the version that f-droid hosts is built by them, rather than being a version uploaded by the dev. If you go to any app's page and check the download section, it says
> It is built by F-Droid and guaranteed to correspond to this source tarball.
So the rest are actually OK with Whatsapp/Signal having the opportunity to see their messages? I would submit that most are not even aware of the issue...
The identity thing is basically the usability issue for E2EE messaging. If you don't solve that then you have not actually increased usability in a meaningful way. The PGP community understood this and did things like organize key signing parties. When is the last time anyone did anything like that for any popular E2EE capable instant messenger?
It also allows the user to place a little less trust in F-Droid, because the developer, as well as F-Droid, must confirm any release before it can be distributed. (Now that I think of it, that probably creates an issue where, if malware somehow slips in, F-Droid has no power to remove it via an automatic update. Perhaps they should have a malware response or notification system?)
More: https://f-droid.org/2023/09/03/reproducible-builds-signing-k...
The UI still sucks, though, because people ask me what the .ASC attachments sent with all of my emails are and if I've been hacked. When I explain that's for encryption, they may ask how to set that up on their phones if they care, but most of them just look at me funny.
I do use email encryption at my job, through S/MIME, and that works fine. Encryption doesn't need terrible UI, but PGP needs support from major apps (including webmail) for it to gain any traction beyond reporting bug bounties.
Signature verification tools on the command line do not surface enough information to make it easy for their users to keep track of what the unsigned input was.
I don't think their users are "end users" though. I am concerned about having better UX and making it more accessible to check these things, but for very advanced users, developers and security professionals. I think surfacing this to end users might come a few steps further down that road, but I am not thinking about that yet. I guess that's why you're talking about trust in google or f-droid, because you're thinking about end users already.
For now, at the very least, professionals should have an easy time keeping track of which unsigned artifact corresponds to a signed artifact, and we are far away from that right now. You have to write code for that, or inspect the binary formats of those signed and unsigned artifacts. That's not good enough. If that code is part of the tool in the first place, that automatically means the semantics of the signature are much better defined.
As far as I understand (I'm not an expert on F-Droid), this validation happens on the server side. The (repo) server verifies that the signature matches that of the first version it saw, the phone (when installing the APK) verifies that the signature matches that of the first version it saw.
Android keeps the fdroidserver honest for upgrades, fdroidserver provides an additional bootstrap point for Android's trust.
> 2. Even if this actually affected the f-droid app, what you described seems like a very modest increase in security. The attack this prevents (i.e. a compromised server serving a backdoored APK with a different signature) would also raise all kinds of alarms from people who already have the app installed, so practically such an attack would be discovered relatively quickly.
Sure, it's the difference between "automated tooling sees the problem immediately and addresses it proactively" vs "hopefully someone will ring the alarm bell eventually".
> Doesn't f-droid perform the build themselves and sign the APK using their own keys? They might be pulling from the upstream repo, but that's in source form, and before APKs are signed, so it's irrelevant.
According to the linked blog post, not anymore. Apparently, these days they serve the author's original APK, but after verifying that they can rebuild it (modulo the signature itself).
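So the check is roughly: rebuild from source, strip the signing data from both APKs, and compare digests. A simplified sketch of that idea, not fdroidserver's actual code (this only skips v1 signature files under META-INF/; v2/v3 signing blocks aren't zip entries, so they're excluded from the hash anyway):

    # Simplified reproducible-build comparison, for illustration only.
    import hashlib, zipfile

    def unsigned_digest(apk_path):
        # Hash all zip entries except META-INF/ (v1 signature files).
        h = hashlib.sha256()
        with zipfile.ZipFile(apk_path) as z:
            for name in sorted(z.namelist()):
                if name.startswith("META-INF/"):
                    continue
                h.update(name.encode())
                h.update(z.read(name))
        return h.hexdigest()

    def verify_reproducible(dev_apk, rebuilt_apk):
        return unsigned_digest(dev_apk) == unsigned_digest(rebuilt_apk)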
If we accept that the world has moved to webmail, or uses a GUI client, then the way to make it easier is to bake it into the client and make it seamless, so there's no manual futzing with anything. Make it like TLS certs: a padlock icon for encrypted mail, yellow for insecure, and mail that fails validation gets a big red warning.
Unfortunately, purists in the community could not accept that, so it's never happened, and so gpg failed to get critical mass before alternatives popped up.
with that stigma no company invested in that entire space for decades! we are still gluing scraps from Canadian phds when it comes to pgp UX.
now that crypto is cool you will get passkeys, which are the obvious evolution of the "url padlock". either the login button is enabled or not. don't question what's happening behind the curtain.
... the fact this entire comment thread is mixing my loose points about the url padlock (consequence) with the CIA actions on pgp (cause)... sigh. I won't bother anymore. enjoy the bliss.
Do you talk to non-technical people? Some people can hardly turn their computer on. Do you really think PGP is in their grasp?
My father, a farmer type born in 1935, managed to use it easily enough when shown how.
It was typical enough of the tools of the time.
34 years ago the average person did not own a computer. What was computer ownership in 1990, 10%? The people who owned computers tended to be wealthy, smart, or hobbyists, which isn't exactly indicative of the average person.
So your father, who had somebody to walk him through it, could figure it out. Well, guess what: the average person doesn't have a technologically knowledgeable child to show it to them.
Perhaps you have a literate child that might explain context.
Encrypted email is near useless. The metadata (subject, participants, etc) is unencrypted, and often as important as the content itself. There are no ephemeral keys, because the protocol doesn't support it (it's crudely bolted on top of SMTP and optionally MIME). Key exchange is manual and a nuisance few will bother with, and only the most dedicated will rotate their keys regularly. It leaves key custody/management to the user: if there was anything good about the cryptocurrency bubble, it's that it proved that this is NOT something you can trust an average person with.
Signed email is also hard to use securely: unless the sender bothered to re-include all relevant metadata in the message body, someone else can just copy-paste the message content and use it out of context (as long as they can fake the sender header). It's also trivial to mount an invisible salamanders attack (the server needs to cooperate).
The golden standard of E2EE UX are Signal, iMessage, and WhatsApp; all the details of signing and encryption are invisible. Anything less is insecure - because if security is optional or difficult, people will gravitate towards the easy path.
The only use-case I have for PGP is verifying the integrity of downloads, but with ubiquitous HTTPS it's just easier to run sha256sum and trust the hash that was published on the website. The chain of trust is more complicated and centralised (involves CAs and browser vendors), but the UX is simpler, and therefore it does a better job.
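For comparison, the entire hash-check workflow amounts to this (the file name and expected digest are placeholders):

    # Equivalent of `sha256sum` plus an eyeball comparison; the file
    # name and expected digest below are placeholders.
    import hashlib

    expected = "<sha256 published on the download page>"
    with open("release.apk", "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()
    print("OK" if actual == expected else "MISMATCH")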
But obviously only for newly submitted apps that can be built reproducibly. Old apps, or even new apps that fail to build reproducibly, continue to be built under the old model, with an F-Droid-provided signing key.
(Even if an older app can actually be built reproducibly these days, it'll still be stuck on having to continue using the F-Droid signing key – key rotation is only supported from Android 9 onwards, and for some reason Google only recommends its actual usage on Android 11 and newer, and then you still need to manage the transition period somehow where app updates need to be dual signed, and once you stop that dual-signing, everybody updating from an old signature needs to install one of those dual-signed versions first before being able to install a version only signed with the new key.)