F-Droid vulnerability allows bypassing certificate pinning
One more note: I have edited the title of this post to be more descriptive and matter-of-fact rather than sensationalist. I'm hoping that the posts in this thread will reflect that direction.
matchboxbananasynergy keep it classy
Apologies if my response seemed intense. Not intended.
The stark truth is that people learn how to install GOS, the most secure mobile OS in the world, and then hit a wall when starting to install apps. I've been on GOS for 5 years across several phones as an amateur. The entire time it has been the GitHub dev crowd poking at F-Droid and F-Droid supporters poking back.
It makes no sense that this has gone on this long. Much appreciated that the GrapheneOS team has endorsed Accrescent. It is just frustrating that this kind of effort took this long and is moving so slowly.
As near as I can tell, Accrescent and AppVerifier are nearly solo projects, just like Divest, which has ended. Are they sustainable?
Great that a credible researcher has found an issue. However, will chain of trust be solved any time soon or will there just be endless poking at each other?
Could more people with skills actually help out?
sky9000 it's not poking for no reason. They seemingly have a systematic issue with their security architecture which needs fixing, but it hasn't been addressed for years (see my response above). Every passing day something new comes up, something where you can only shake your head in disbelief. I wholeheartedly agree that there should be a trusted app store; however, the only people seemingly capable of such a feat are drowning in work.
I was a big proponent of F-Droid until I did my research and found the gaping holes that plague it to this day. Now I recommend people to use different clients and disable their repo. And this wouldn't be the case if they just FINALLY fixed their problems!
missing-root F-Droid has had major unaddressed security issues for years which have been repeatedly raised. The development team have demonstrated an extremely anti-security attitude. They disregard basic security best practices for Android development and are highly resistant to improving. This is one symptom of the poor security of F-Droid rather than even being one of the major issues with it. The entire approach of automatically fetching and building apps in an outdated environment on outdated, poorly maintained infrastructure, which are then signed with their own keys while using the official app ids, is horrible. The whole thing needs to be thrown out and replaced. They also don't follow the security model for app sources, don't support key rotation and have many other issues. F-Droid causes major usability issues with profiles due to not following Android development best practices, which has put a substantial support burden on us. Steering people away from it is important to avoid them having a bad experience using profiles due to F-Droid's misuse of app ids not belonging to them.
It is easy to take shots at F-Droid but they are the ones attempting the heavy lifting with chain of trust.
This is complete nonsense. They're not doing any heavy lifting and are making the situation much worse. They've introduced major usability issues and confusion by reusing app ids with their own signing keys. This does not provide a chain of trust but rather breaks it, since the apps are not signed by the developers and securely provided to you but rather built on insecure server infrastructure by quite clearly untrustworthy people who do not care about security in the slightest and are only going through the motions of pretending they do.
The stark truth is that people learn how to install GOS, the most secure mobile OS in the world and then hit a wall when starting to install apps.
The main way to get Android apps is the Play Store and it works fine. A bunch of other options are available and it's not a GrapheneOS issue that those are quite fragmented.
The entire time it has been Github Dev crowd poking at FDroid and FDroid supporters poking back.
F-Droid is incredibly poorly designed and maintained. It's a complete security disaster. It's a usability disaster. It has never been a good option for obtaining apps and was always preventing progress in the space due to people focusing on it and not making something much better.
F-Droid automatically downloads and builds code, so despite their false marketing it does not protect users from the developers in any real way. It often adds substantial delays for updates including security patches. Their builds have often rolled back dependency versions, signing scheme version and the SDK to much older versions with security flaws. They've consistently introduced security flaws to the apps. It adds additional trusted parties who have demonstrated a clear lack of trustworthiness including several of their core team members spreading fabricated stories and repeatedly engaging in cover ups. In what sense is this a safe way to get apps?
It makes no sense that this has gone on this long. Much appreciated that the GrapheneOS team has endorsed Accrescent. It is just frustrating that this kind of effort took this long and is moving so slowly.
F-Droid has massively hindered progress because it got the mind share of a lot of the open source community before people started realizing how awful it is and that it's not ever going to become a great platform. The people developing it are incapable of making good software, do not like the overall platform, do not understand it and don't truly want it to succeed. The main developer has repeatedly made statements against app sandboxing and other basic tenets of privacy and security.
As near as I can tell, Accrescent, Appverifier are nearly solo projects just like Divest which has ended. Are they sustainable?
What makes you assume F-Droid is sustainable? F-Droid has severe security flaws throughout their app, repository, infrastructure for building/signing apps and servers. These have been repeatedly pointed out and the remaining team has focused on covering up issues, misdirecting from it and making excuses for it. F-Droid also causes major usability issues with Android profiles due to them reusing app ids with different signing keys, which they refuse to even acknowledge is against the basic best practices for Android development. They also still publish an outdated F-Droid version on their site as the main download option, causing more usability issues and also not giving people the latest security fixes from the start.
Great that a credible researcher has found an issue. However, will chain of trust be solved any time soon or will there just be endless poking at each other?
If you continue to misrepresent, downplay and deny the many real issues about F-Droid, you'll no longer be participating in our community.
Could more people with skills actually help out?
F-Droid is the main barrier to getting well designed, secure distribution of open source and other apps fully implemented and widely adopted. It's a huge barrier to it. F-Droid is actively negligent and is putting users at risk. They've taken many anti-security positions and are against basic tenets of security. Their team has consistently attacked GrapheneOS. Helping them deters progress rather than enabling it. You portray this as if we're trying to accomplish the same thing and should be working with people involved in harassment towards our team.
I have a question that is on-topic. Can someone explain what certificate pinning it is that can be bypassed? What are the consequences of this vulnerability for the end-user?
I suppose the F-Droid repository looks pretty much like one on any Linux distribution, that is, the repository data your device retrieves is signed with a key controlled by F-Droid, and each app mentioned there has a SHA256 of the APK file. My understanding was that F-Droid always retrieves source code releases for apps and builds the apps itself. Is it the source code archives that are signed in a way that can be bypassed? Or is there some other component here that works in a way not typical of how Linux app repositories are maintained?
I couldn't really understand from the vulnerability description or the linked mailing list posts, so would appreciate if someone would explain.
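To make the distro-style model described above concrete, here is a minimal sketch in Python. The index layout and names here are illustrative assumptions, not F-Droid's actual format: a signed index maps app ids to pinned SHA-256 digests, and a downloaded APK is only trusted if its digest matches.

```python
import hashlib

def verify_apk(index: dict, name: str, apk_bytes: bytes) -> bool:
    # Step 1 (not shown): verify the signature on `index` against the
    # repository's public key before trusting anything inside it.
    # Step 2: the index entry pins the APK's SHA-256; compare digests.
    expected = index[name]["sha256"]
    return hashlib.sha256(apk_bytes).hexdigest() == expected

# Hypothetical index entry for a hypothetical app id.
index = {"app.example": {"sha256": hashlib.sha256(b"apk-bytes").hexdigest()}}

assert verify_apk(index, "app.example", b"apk-bytes")       # untampered
assert not verify_apk(index, "app.example", b"tampered")    # rejected
```

The key design point is that the per-APK hash is only as trustworthy as whoever signs the index, which is why the discussion centers on who controls F-Droid's signing infrastructure.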
sky9000 Great that a credible researcher has found an issue. However, will chain of trust be solved any time soon or will there just be endless poking at each other?
Could more people with skills actually help out?
My impression is that there is absolutely zero agreement about what even an ideal app repository for GrapheneOS would look like. There just seem to be so many competing security goals, with no proposed solution solving them all.
I guess the best bet to replace the use-case of F-Droid would be for a trusted party to start downloading and compiling the source code for a very large number of apps, in a secure and modern environment, and then distribute it all like a traditional signed Linux app repository. And making local patches to the source code for an app whenever it has too low a target SDK, has trackers or ads, or has other security or privacy issues. This would be close to how F-Droid is run, and would give the number of apps needed for critical mass without relying on each app developer submitting their app to a yet mostly unknown app repository, but it could take security seriously too, in the way those of us who use F-Droid today would want.
Problem is, the amount of work to maintain an app repository, let alone one that has most modern open source apps, is a gigantic undertaking, not something a single person can really do. But if someone were to do it, I think most or all GrapheneOS users using F-Droid today as their primary or only app store would switch to it pretty much immediately.
ryrona It looks like there are multiple issues. One of the issues looks like not properly checking signatures for imported APKs where malicious APKs can be imported and end up included in the signed metadata. As far as we're concerned, the existing major design flaws and lack of trustworthiness of people with access to their critical infrastructure including the badly maintained build / signing infrastructure are a bigger issue. The status quo of what's known about F-Droid's processes, infrastructure and team is already very scary.
F-Droid doesn't provide most modern open source apps in their official repository. A huge portion of the apps it does have are completely abandoned or very poorly maintained. A huge portion fail to meet important privacy and security standards which even the very lenient Play Store implements such as a target API level requirement. It's a poor way of showing off the open source app ecosystem for Android. It provides most apps which come from particular parts of the open source community, but with delayed updates and often problematic changes including introducing security vulnerabilities via downgraded dependencies.
It's their own builds on their own infrastructure, almost entirely with their own signing keys except for the small portion where they use their problematic reproducible builds system. That system locks in using a problematic legacy build environment and will result in even more significantly delayed updates due to how they set it up. They make some minor, undocumented and often quite problematic changes to strip out functionality depending on Play services, etc. They certainly don't systematically update library dependencies or anything like that. They often actually downgrade dependencies. They've consistently introduced serious vulnerabilities to apps through using legacy tools, SDK, libraries, etc.
Simply building a similar kind of system following security best practices, not reusing app ids, not using a bunch of legacy software, but still redoing the builds like a traditional Linux distribution, would be far better. There are massive problems with both their approach and implementation.
We don't consider F-Droid to be a trustworthy source of apps at all. They consistently disregard security and actively take anti-security positions. They don't care about or follow best practices for Android or security. They put on a show with the near useless shallow audits. They fundamentally don't understand or care about it. They also consistently engage in cover ups, misdirection and misinformation as tactics to deal with security vulnerabilities and deep security design flaws throughout the software.
Several F-Droid core developers supported Copperhead's 2018 takeover attempt on GrapheneOS and then supported Copperhead's attacks over the following years. They moved on to engaging in spreading fabricated stories about our team themselves with the goal of directing harassment towards us. They supported extreme harassment from others. Nothing was done about their project members baselessly calling one of our team members insane, schizophrenic, delusional, etc. across a bunch of chat rooms and elsewhere. They made up fabricated stories about that too. It reflects on the lack of character of the overall F-Droid team. Why should GrapheneOS users trust people engaging in such underhanded tactics with the critical role of building and signing most of their apps? It's never something our community is going to broadly support even aside from the technical issues. Technical issues could theoretically be fixed (although there's no sign of much changing) but it's still not going to be run by trustworthy people.
- Edited
GrapheneOS Why should GrapheneOS users trust people engaging in such underhanded tactics with the critical role of building and signing most of their apps?
Probably because they didn't (don't) know (better).
I know I didn't. Especially what was written here is so informative. Thanks :)
I uninstalled the F-Droid Client the other day off my Pixel because of this article:
https://privsec.dev/posts/android/f-droid-security-issues/
And just now off my girlfriend's Pixel.
The apps were replaced with APKs directly from the dev.
No point, for me, in running GrapheneOS because of its enhanced security and then introducing a risk through F-Droid.
I didn't have many apps from them anyway (Mullvad and IVPN).
Only to avoid the PlayStore.
Just looking into Obtainium, Accrescent and AppVerifier.
Thanks guys, always a pleasure to educate oneself here and implement then at home :)
- Edited
Pretty amazing work in that GitHub repo; full credit to what appears to be a security researcher or consultant doing it.
ryrona do you agree that the "xz" situation can always happen if you blindly trust app devs without reproducible builds enforced and checked for each update?
Having a security model like Accrescent, where app devs do whatever they want and the chain of trust starts, uncontrolled, on their desk, is problematic.
So I think that both approaches have issues. A recent, patched and secure build environment with focus on recent Android versions, is a must of course.
- Edited
ryrona I have a question that is on-topic. Can someone explain what certificate pinning it is that can be bypassed? What are the consequences of this vulnerability for the end-user?
I can't explain every angle in full detail, but this problem affects fdroidserver (which is a standalone thing), where (in theory) there should be checks against get_first_signer_certificate. In this exploit, fdroidserver checks the multiple signing versions/methods (v1, v2, v3) in the wrong order (Android checks them in reverse: v3, v2, v1). Usually, get_first_signer_certificate would be the basis for subsequent app submissions, which makes sense, since the original app dev is the first one to submit an app. With the reversed checking, v2 and v3 get ignored by fdroidserver and you can sign APKs with whatever certificate you like, as long as the APK is signed via v1. This is especially problematic for third-party repos, where apps get grouped in the F-Droid store as what is basically an identical APK and you are just one tap away from installing a possibly completely different, malicious APK.
F-Droid devs acknowledged that there would be issues with their approach (instead of Obfusk's) and in a different issue, they wrote this (pasted as spoiler for better readability):
Apps in f-droid.org could only be affected if:
The metadata uses Binaries:
The app has a minSdkVersion of 24 or higher
The APK with the bad signature took the place of the good APK at the location the URL in Binaries: points to (e.g. attacker got GitHub/GitLab credentials of upstream developer).
There is a source code release to match the bad APK, e.g. included and tagged in the git repo.
There is a fdroiddata Builds: entry to build the bad APK.
The bad APK would have to be reproducible.
Additionally, to be an effective attack, it would require that:
Target does not already have the app installed (otherwise updates would fail at the Android signature check).
No one who had the app installed noticed that the update failed due to Android signature check.
Upstream dev does not notice bad APKs get posted, the source code pushes, or the new tag.
At best, this vulnerability lets the attacker act as if AllowedAPKSigningKeys was not set. Then the attacker would still have to compromise other visible things to get anywhere.
This does not affect signature copying method used when fdroid signatures copies files into fdroiddata. This is the preferred method for including reproducible builds with the highest level of security, but it is more work to maintain than the Binaries: approach.
binary repos
If a binary repo maintainer is not careful about where they get their APKs and relies completely on AllowedAPKSigningKeys to verify the APKs, then this is an important issue. The verification practices of the repo maintainer are not externally visible, so there is an element of trusting the maintainer. Some things can be visible, like whether the repo includes GPG signatures on the APKs.
From the beginning, https://guardianproject.info/fdroid/repo/ relied on GPG signatures on every APK it downloaded and included. Those GPG signatures are part of the repo so others can verify them as well. AllowedAPKSigningKeys: was added because it is easy to use, and then we can dogfood it also. And hopefully, we can get AllowedAPKSigningKeys: good enough that the GPG signatures are no longer relevant. APK v3 Signatures are looking pretty good these days.
Overall, this is a very dangerous scenario which is confirmed by the devs, but earns nothing more than a shoulder shrug. Furthermore, it shifts responsibility to 3rd party repos (who would've seen that coming!).
Just irresponsible behavior and completely in line with what the project account wrote above.
The solution as others keep mentioning seems simple. Don't use F-Droid or any client for it.
Use the GrapheneOS-recommended app store apps: Accrescent; Obtainium with AppVerifier to check the hashes/checksums of downloaded APK files; the Google Play Store app to access stuff with a Google account; and the pre-installed App Store that comes with GOS.
Aurora Store is not recommended.
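As a rough illustration of the checksum step (AppVerifier itself compares app signing-certificate digests; this sketch just shows the general idea of hashing a downloaded file and comparing it against a value the developer publishes):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, streaming so large APKs
    don't have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage: compare against the digest published with the release, and only
# install if the two hex strings match exactly.
# if sha256_of("app.apk") != published_digest: abort()
```

This only proves the file wasn't corrupted or swapped in transit; it still assumes the published digest itself came over a trustworthy channel.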
- Edited
Basically avoid using or funding F-Droid (financially via donations). Eventually the F-Droid way of life will end (we can only hope).
- Edited
GrapheneOS One of the issues looks like not properly checking signatures for imported APKs where malicious APKs can be imported and end up included in the signed metadata.
Where and why would they import APKs? Aren't they building everything from source? If they are importing already pre-compiled APKs without verifying them through reproducing the builds, that is a pretty big issue in my threat model, even if there were no certificate pinning bypass.
GrapheneOS Simply building a similar kind of system following security best practices, not reusing app ids, not using a bunch of legacy software but still redoing the builds like a traditional Linux distribution would be far better.
Yeah, that is what I tried to say.
GrapheneOS Why should GrapheneOS users trust people engaging in such underhanded tactics with the critical role of building and signing most of their apps?
Because, unfortunately, there are no alternatives.
missing-root do you agree that the "xz" situation can always happen if you blindly trust app devs without reproducible builds enforced and checked for each update?
Yes, and that is not even what I worry about the most.
App developers seldom have the expertise and capabilities to build their apps in secure environments; they just build them on their regular computer. Most of them probably store the signing key there too, not protecting it using dedicated hardware at all. Who knows how many people beyond the app developer can modify or sign the releases. Individual app developers may also easily comply and sign malicious builds if persuaded to do so by an authority. I think it is fair to say it is mostly only developers of privacy apps and cryptocurrency apps that get it right, and even then they have been hit by supply chain attacks repeatedly, while traditional Linux repositories have been able to withstand that better, even if only because of being somewhat slower moving. The xz backdoor failed because it targeted those slow-moving Linux distributions, but many people have lost all their cryptocurrency because the wallet app they were using upgraded to a malicious dependency, without the developer knowing the dependency was compromised, and it was all deployed to end users within days.
The ones who maintain an app repository are expected to be able to build apps in a secure environment, to maintain signing keys using dedicated hardware in a secure way, and to be able to act on reports of backdoors and security vulnerabilities even when the app developer won't.
The whole situation here is because F-Droid isn't trusted to carry that role. But there is literally no other app repository that even attempts to do it. So there are no alternatives.
missing-root Having a security model like Accrescent, where app devs do whatever they want and the chain of trust uncontrolled starts on their desk, is problematic.
Yes, I have zero trust in Accrescent. They take the pre-compiled APKs from the developers, do not perform any verification like reproducible builds, and, despite their very small number of apps, have already accepted one with a bundled affiliate link containing a tracker, and seem not to care about that. (Organic Maps, by the way; F-Droid removes that tracker.)
xuid0 The solution as others keep mentioning seems simple. Don't use F-Droid or any client for it.
Unfortunately, there are no options for me.
xuid0 Use Obtainium
That would go from having minimal peer review to having no peer review at all. Absolutely not a suitable choice in my threat model. I need that peer review; my security very much relies on it.
xuid0 Google Play store app
They outright permit and promote apps with very privacy-invasive and freedom-restricting technologies. They very evidently do not care about privacy at all, so how could I trust any app installed from there? At best, it could be like using Obtainium.
xuid0 Eventually the F-Droid way of life will end (we can only hope).
Unless we have an alternative by then, it would probably be the point at which I have to stop using phones for security and privacy sensitive things.
ryrona couldn't have phrased it better
missing-root The xz backdoor was almost entirely in the Git source repository and could have been completely done there. It had part of it included in the source tarball at the end without it being reproducible from the source repository. However, they could have put it into the source repository and it almost certainly wouldn't have been spotted based on that, especially considering most of it was already there. They took a bigger risk of discovery by modifying the source tarball the way they did. It appears it happened the way it did because previously they didn't have the ability to make the source tarballs themselves so they were doing it at a source code level. They switched to an easier but more easily discovered approach once they got control over releases. Look at how much was included in the source repository and how they did the final part in the source tarball, and you can see they clearly could have done the rest in the source repository. They already had nearly all of the backdoor as public source code without discovery, just not the final additions to the test cases.
F-Droid blindly fetches and builds code from app developers. How do reproducible builds for a tiny portion of the apps help anything when the source code isn't actually being checked? It has no practical benefits aside from marketing. GrapheneOS has reproducible builds and someone is now actually checking it for each release, but how does that actually benefit anyone in practice?
- Edited
GrapheneOS How do reproducible builds for a tiny portion of the apps help anything when the source code isn't actually being checked? It has no practical benefits aside from marketing. GrapheneOS has reproducible builds and someone is now actually checking it for each release, but how does that actually benefit anyone in practice?
Because it is far easier to insert a backdoor in the compiled binary, where no one would ever be able to tell, than to insert it into the source code, where it could be discovered and seen by others reading the code. The whole point of reproducible builds is to make backdoors visible. Sure, maybe no one is looking for them, but now they can. That has quite a lot of value, if nothing else as a deterrent.
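The check itself is simple, which is part of the appeal. A minimal sketch (illustrative only, assuming you have both the published release and your own rebuild from the same source on disk):

```python
def builds_match(release_path: str, rebuilt_path: str) -> bool:
    """Reproducible-build check: the locally rebuilt artifact must be
    bit-for-bit identical to the published release. Any difference means
    either the build is not reproducible or the release was modified."""
    with open(release_path, "rb") as a, open(rebuilt_path, "rb") as b:
        return a.read() == b.read()

# Usage: anyone can independently run
#   builds_match("official-release.apk", "my-rebuild.apk")
# and publicly report a mismatch.
```

In practice this requires the whole toolchain and build environment to be pinned so independent rebuilds actually converge, which is the hard part the argument above is about.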