Basically avoid using or funding F-Droid (financially via donations). Eventually the F-Droid way of life will end (we can only hope).
GrapheneOS One of the issues looks like not properly checking signatures for imported APKs where malicious APKs can be imported and end up included in the signed metadata.
Where and why would they import APKs? Aren't they building everything from source? If they are importing already pre-compiled APKs without verifying them by reproducing the builds, that is a pretty big issue in my threat model, even if there is no certificate pinning bypass.
GrapheneOS Simply building a similar kind of system following security best practices, not reusing app ids, not using a bunch of legacy software but still redoing the builds like a traditional Linux distribution would be far better.
Yeah, that is what I tried to say.
GrapheneOS Why should GrapheneOS users trust people engaging in such underhanded tactics with the critical role of building and signing most of their apps?
Because, unfortunately, there are no alternatives.
missing-root do you agree that the "xz" situation can always happen if you blindly trust app devs without reproducible builds enforced and checked for each update?
Yes, and that is not even what I worry about the most.
App developers seldom have the expertise and capabilities to build their apps in secure environments; they just build them on their regular computer. Most of them probably store the signing key there too, not protecting it with dedicated hardware at all. Who knows how many people besides the app developer can modify or sign the releases. Individual app developers may also easily comply if persuaded by an authority to sign malicious builds. I think it is fair to say it is mostly only developers of privacy apps and cryptocurrency apps that get this right, and even then they have been hit by supply chain attacks repeatedly, while traditional Linux repositories have been able to withstand that better, even if only because they are somewhat slower moving. The xz backdoor failed because it targeted those slow-moving Linux distributions, but many people have lost all their cryptocurrency because the wallet app they were using upgraded to a malicious dependency, without the developer knowing the dependency was compromised, and it was all deployed to end users within days.
The ones who maintain an app repository are expected to be able to build apps in a secure environment, to maintain signing keys in a secure way using dedicated hardware, and to be able to act on reports of backdoors and security vulnerabilities even when the app developer doesn't.
The whole situation here exists because F-Droid isn't trusted to carry that role. But there is literally no other app repository that even attempts to do it. So there are no alternatives.
missing-root Having a security model like Accrescent, where app devs do whatever they want and the chain of trust starts uncontrolled on their desk, is problematic.
Yes, I have zero trust in Accrescent. They take the pre-compiled APKs from the developers, do not perform any verification like reproducible builds, and despite their very small number of apps have already accepted one with a bundled affiliate link with a tracker, and seem not to care about that. (Organic Maps, by the way; F-Droid removes that tracker.)
xuid0 The solution as others keep mentioning seems simple. Don't use F-Droid or any client for it.
Unfortunately, there are no options for me.
xuid0 Use Obtainium
That would go from having minimal peer review to having no peer review at all. Absolutely not a suitable choice in my threat model. I need that peer review; my security very much relies on it.
xuid0 Google Play store app
They outright permit and promote apps with very privacy-invasive and freedom-restricting technologies. They very evidently do not care about privacy at all, so how could I trust any app installed from there? At best, it would be like using Obtainium.
xuid0 Eventually the F-Droid way of life will end (we can only hope).
Unless we have an alternative by then, it would probably be the point at which I have to stop using phones for security and privacy sensitive things.
ryrona couldn't have phrased it better
missing-root The xz backdoor was almost entirely in the Git source repository and could have been completely done there. It had part of it included in the source tarball at the end without it being reproducible from the source repository. However, they could have put it into the source repository and it almost certainly wouldn't have been spotted based on that, especially considering most of it was already there. They took a bigger risk of discovery by modifying the source tarball the way they did. It appears it happened the way it did because previously they didn't have the ability to make the source tarballs themselves so they were doing it at a source code level. They switched to an easier but more easily discovered approach once they got control over releases. Look at how much was included in the source repository and how they did the final part in the source tarball, and you can see they clearly could have done the rest in the source repository. They already had nearly all of the backdoor as public source code without discovery, just not the final additions to the test cases.
F-Droid blindly fetches and builds code from app developers. How do reproducible builds for a tiny portion of the apps help anything when the source code isn't actually being checked? It has no practical benefits aside from marketing. GrapheneOS has reproducible builds and someone is now actually checking it for each release, but how does that actually benefit anyone in practice?
GrapheneOS How do reproducible builds for a tiny portion of the apps help anything when the source code isn't actually being checked? It has no practical benefits aside from marketing. GrapheneOS has reproducible builds and someone is now actually checking it for each release, but how does that actually benefit anyone in practice?
Because it is far easier to insert a backdoor in a compiled binary, where no one would ever be able to tell, than to insert it into the source code, where it could be discovered by others reading the code. The whole point of reproducible builds is to make backdoors visible. Sure, maybe no one is looking for them, but now they can. That has quite a lot of value, if nothing else as a deterrent.
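To make that concrete, here is a minimal sketch of what checking a reproducible build boils down to: rebuild the app yourself from the tagged source and compare digests with the published binary. The file names are hypothetical, and real verification tooling additionally has to normalize signatures first.

```python
import hashlib
import sys

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: the APK the project published, and the one you
# rebuilt yourself from the tagged source in a clean build environment.
published = sha256_of("app-release-published.apk")
rebuilt = sha256_of("app-release-rebuilt.apk")

if published == rebuilt:
    print("OK: the published APK is exactly what the source produces")
else:
    print("MISMATCH: the published binary does not match the source")
    sys.exit(1)
```

Anyone can run such a check independently, which is where the deterrent effect comes from.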
Where and why would they import APKs? Aren't they building everything from source? If they are importing already pre-compiled APKs without verifying them by reproducing the builds, that is a pretty big issue in my threat model, even if there is no certificate pinning bypass.
They almost entirely build and sign the apps themselves. For a small portion of the apps, they use the developer-signed APKs if they match what they build. This can go very wrong, with updates getting indefinitely delayed, especially since they use a problematic and outdated build environment. There are prebuilt libraries either way. They do not review anything when fetching and building the code, but rather do automated scanning, including with antivirus. Fetching and building the code is automated. Signing is triggered manually for batches of code. It's almost entirely automated on a server, other than signing, and they might automate signing too. The signing not being automated doesn't mean there's some kind of review process before it's signed.
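As an aside, a toy sketch of what that kind of developer-signed vs. self-built comparison can look like: compare every ZIP entry of the two APKs while ignoring the signature files. This is only illustrative; the file names are hypothetical, F-Droid's real tooling differs in detail, and modern APK signatures (v2/v3) also live outside the ZIP entries, so a real check has more to strip.

```python
import zipfile

def content_digests(apk_path: str) -> dict:
    """Map each APK entry (except the v1 signature dir) to its stored CRC-32.

    The CRC-32 comes straight from the ZIP metadata; a production tool
    should hash the decompressed contents with a cryptographic digest.
    """
    with zipfile.ZipFile(apk_path) as z:
        return {
            info.filename: info.CRC
            for info in z.infolist()
            if not info.filename.startswith("META-INF/")
        }

dev_signed = content_digests("developer-signed.apk")  # hypothetical path
local_build = content_digests("locally-built.apk")    # hypothetical path

if dev_signed == local_build:
    print("Contents match apart from signatures: the builds agree")
else:
    changed = sorted(k for k in dev_signed.keys() | local_build.keys()
                     if dev_signed.get(k) != local_build.get(k))
    print("Differing entries:", changed)
```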
Because, unfortunately, there are no alternatives.
There are a bunch of alternatives. F-Droid is hardly the only way to obtain builds of open source apps.
App developers seldom have the expertise and capabilities to build their apps in secure environments; they just build them on their regular computer. Most of them probably store the signing key there too, not protecting it with dedicated hardware at all. Who knows how many people besides the app developer can modify or sign the releases. Individual app developers may also easily comply if persuaded by an authority to sign malicious builds. I think it is fair to say it is mostly only developers of privacy apps and cryptocurrency apps that get this right, and even then they have been hit by supply chain attacks repeatedly, while traditional Linux repositories have been able to withstand that better, even if only because they are somewhat slower moving. The xz backdoor failed because it targeted those slow-moving Linux distributions, but many people have lost all their cryptocurrency because the wallet app they were using upgraded to a malicious dependency, without the developer knowing the dependency was compromised, and it was all deployed to end users within days.
The xz backdoor did not fail. It was shipped and then discovered by someone in production due to a completely unnecessary flaw in the backdoor. It shipped in the more bleeding edge variants of Debian and RHEL. It was only caught because it had a major performance issue they should have avoided. They also unnecessarily modified the release tarball in a way that wasn't why it was detected but could have been detected. They made nearly all their changes in the source repository but then unnecessarily bundled the final bits of the obfuscated payload in the tarball. It's a strange decision since it was well obfuscated as compressed data in tests. It looks a lot like they spent a lot of time doing things slowly and carefully but then rushed to finish it and blew their cover in doing that. It's entirely possible there were a bunch of servers compromised due to it based on crawling all the servers with sshd across the whole IPv4 space looking for it.
The ones who maintain an app repository are expected to be able to build apps in a secure environment, to maintain signing keys in a secure way using dedicated hardware, and to be able to act on reports of backdoors and security vulnerabilities even when the app developer doesn't.
F-Droid is building on a server, not local infrastructure. That's much worse than a developer building on their workstation especially since it would result in the entire repository being compromised not only a single app. A developer's workstation being compromised means a backdoor can be put directly in the sources, and as the xz situation showed, the backdoor being present in the sources doesn't mean it will be found. The xz backdoor was largely present in the source repository and fully present in the source tarball. It was a source code level backdoor, not something included in binary releases from the project. You portray it as if that's not the case but it was, and it did ship in distributions including Debian. It didn't make it all the way to Debian stable because of how slowly it updates. If they had done a better job and not introduced a huge performance problem, it would not have been discovered when it was. It's unclear when it would have been discovered.
Yes, I have zero trust in Accrescent. They take the pre-compiled APKs from the developers, do not perform any verification like reproducible builds, and despite their very small number of apps have already accepted one with a bundled affiliate link with a tracker, and seem not to care about that. (Organic Maps, by the way; F-Droid removes that tracker.)
That's not a tracker.
That would go from having minimal peer review to having no peer review at all. Absolutely not a suitable choice in my threat model. I need that peer review; my security very much relies on it.
F-Droid does not review the code. They automatically fetch and build it. They scan for non-open-source code and run it through antivirus. That's what they actually do, not what people believe they do.
Unless we have an alternative by then, it would probably be the point at which I have to stop using phones for security and privacy sensitive things.
You prefer using a system where a backdoor was shipped to users and was only discovered because it had awful performance, and you do so because you wrongly attribute the discovery of that backdoor to that system. It doesn't make much sense. The reality is that the system you're promoting completely failed at stopping this and adds a huge number of trusted parties. Extremely delayed updates leaving people vulnerable to many unpatched security bugs are what protected people from getting it, not any kind of review or safety provided by thousands of packagers acting as additional trusted parties in Debian.
The xz situation in fact demonstrated that this packaging system even with reproducible builds will not stop developers shipping a backdoor to users via the source code built by these distributions / repositories. It was a backdoor in the source tarball and they shipped it to users. It was caught due to incredibly poor performance due to low quality of implementation for the backdoor, not these packaging systems.
missing-root Too bad it's untrue.
GrapheneOS They almost entirely build and sign the apps themselves. For a small portion of the apps, they use the developer-signed APKs if they match what they build. This can go very wrong, with updates getting indefinitely delayed, especially since they use a problematic and outdated build environment. There are prebuilt libraries either way. They do not review anything when fetching and building the code, but rather do automated scanning, including with antivirus. Fetching and building the code is automated. Signing is triggered manually for batches of code. It's almost entirely automated on a server, other than signing, and they might automate signing too. The signing not being automated doesn't mean there's some kind of review process before it's signed.
This did not answer my question. I wanted to know about the vulnerability this thread is about. What certificate pinning is it that is being bypassed?
GrapheneOS The xz situation in fact demonstrated that this packaging system even with reproducible builds will not stop developers shipping a backdoor to users via the source code built by these distributions / repositories.
Everything we are doing within the security community is raising the bar: the amount of effort an attacker needs to spend to compromise a user or project. No one will ever be totally immune to attacks. If you didn't have reproducible builds, you could easily insert a backdoor into GrapheneOS, and no one would ever be able to tell. But now you have reproducible builds. That means anyone diffing the changes from GrapheneOS release to release would very likely discover your attempt to insert the backdoor, unless you have extremely advanced skills in writing code that looks genuine and correct but is in fact malicious and contains a backdoor. You would have to have the skills to fool the ones reading your code. That is a way higher bar. Especially since the backdoor getting discovered would mean all trust in you and GrapheneOS immediately disappears, so you never get a second chance. Not even the malicious xz developer had those skills; it got discovered after all, and he was seemingly a very skilled state-employed hacker.
I think you are downplaying the value reproducible builds have. As far as I see it, they are one of the most important security advances we have had, right there alongside memory-safe programming languages and end-to-end encryption.
Not sure if this is whataboutism or some other flawed argument.
For sure it is an issue to use software made by other people.
In the end, if you have reproducible builds you can check the code and be safe. If you just take some release APKs, you can check the sources and still get malware.
ryrona
Having reproducible builds is nice. But I'm not using GOS or any other apps because they are reproducible. I'm using them because I trust the devs to write code in good faith and build it correctly. I also trust Google Play to verify that apps are genuine and to deliver them to me in a secure way.
I think if your threat model requires reproducible builds, you should also check the entire source and trust yourself to be able to find any backdoors in it. It's not enough to just trust others to inspect the source for you. It'd be better to have your own secure build environment and build it yourself.
missing-root In practice, neither you nor others are checking the sources. Even if you were checking the sources, finding an intentionally hidden vulnerability is unlikely. Serious vulnerabilities often last for years or even decades in widely used and widely reviewed projects like the Linux kernel. If accidental vulnerabilities can't be reliably spotted even after substantial review, auditing, etc., that doesn't bode well for the ability to find a backdoor.
The xz situation was brought up, and that was not spotted in the source code after several rounds of them adding backdoor infrastructure to the Git repository. It wasn't spotted when they put the finishing touches in the source tarball for the release, but it's highly unlikely it would have been spotted if they'd pushed it to Git, since the final touches were well disguised / hidden. It was only the overall set of changes which, when put together, triggered deobfuscating a payload and using it maliciously. Most of that was already in the Git repository before the final pieces were added. It's unclear why they took the risk of making a far more non-reproducible source tarball that someone might have noticed differed from what got generated from the Git repository. It's an example of their lack of stealth and finesse despite the long-term commitment to it. They also severely screwed up the performance, and that's why it was discovered: unnecessarily causing huge spikes of CPU usage. That likely would have been spotted by others eventually. If they hadn't made those mistakes, there's a high chance it would have gone undiscovered for months or longer. Would it have made it to Debian stable? Probably not, considering Debian freezes packages for years and hasn't had a new release yet, but Debian stable is full of unpatched, known vulnerabilities in a lot of the packages, including things like web and mail servers which are remote-facing; upstream projects typically don't classify all the little memory corruption bug fixes as security vulnerabilities with CVE assignments, and most projects don't seek out CVE assignments at all.
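Incidentally, that tarball-vs-Git discrepancy is mechanically checkable. A rough sketch, assuming a local clone and a downloaded release tarball (the repo path, tag, and tarball name are placeholders); note that many projects' tarballs legitimately differ from the Git tree because of generated autotools files, which is exactly the gap the xz attacker hid in:

```python
import hashlib
import subprocess
import tarfile

def git_tree_files(repo: str, tag: str) -> dict:
    """SHA-256 of every blob in the tagged Git tree, keyed by path."""
    listing = subprocess.run(
        ["git", "-C", repo, "ls-tree", "-r", tag],
        capture_output=True, text=True, check=True,
    ).stdout
    files = {}
    for line in listing.splitlines():
        meta, path = line.split("\t", 1)
        blob = meta.split()[2]
        data = subprocess.run(
            ["git", "-C", repo, "cat-file", "blob", blob],
            capture_output=True, check=True,
        ).stdout
        files[path] = hashlib.sha256(data).hexdigest()
    return files

def tarball_files(path: str) -> dict:
    """SHA-256 of every regular file in the tarball, top-level dir stripped."""
    files = {}
    with tarfile.open(path) as tar:
        for member in tar:
            if member.isfile() and "/" in member.name:
                name = member.name.split("/", 1)[1]  # drop "project-x.y.z/"
                data = tar.extractfile(member).read()
                files[name] = hashlib.sha256(data).hexdigest()
    return files

repo_files = git_tree_files("xz", "v5.6.1")        # placeholder repo and tag
release_files = tarball_files("xz-5.6.1.tar.gz")   # placeholder tarball

print("Only in the tarball:", sorted(set(release_files) - set(repo_files)))
for path in sorted(set(repo_files) & set(release_files)):
    if repo_files[path] != release_files[path]:
        print("Modified relative to Git:", path)
```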
What if you only use F-Droid with the following repos?
- https://mobileapp.bitwarden.com/fdroid/repo
- https://releases.threema.ch/fdroid/repo
These repos are from the official websites.
Are they also unsafe? Is it worse or better to use these repos than the normal ones from F-Droid? How bad is it in comparison to GitHub?
xuid0 GitHub is not better here. It adds an additional middleman you need to trust.
Also, Obtainium has no method of verifying packages; only the Android package manager solves this, if the APKs are signed.
Using F-Droid repos will be more performant, and the F-Droid client is also more minimal. Using official repos from the devs should eliminate all risks with F-Droid, apart from maybe the client being outdated; in that case you can still use F-Droid Basic.
Sorry I did not write this earlier, but using the Obtainium app together with the AppVerifier app is the recommended way to directly obtain APKs via GitHub, then comparing the certificate hashes of the APK using AppVerifier against its known internal database.
We don't want to blindly trust the APK one downloads from GitHub without making an effort to verify the certificate hashes of the APK. If it doesn't work, I would ask in the Matrix room listed under Community:
https://github.com/soupslurpr/AppVerifier
F-Droid or using any F-Droid client is not recommended: https://privsec.dev/posts/android/f-droid-security-issues
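For the curious, the certificate comparison AppVerifier automates can also be done by hand with apksigner from the Android build-tools. A sketch, with the expected digest and file name as placeholders you would fill in from the developer's published value:

```python
import re
import subprocess
import sys

# Placeholder: the SHA-256 certificate digest obtained out-of-band,
# e.g. from the developer's website or AppVerifier's internal database.
EXPECTED = "<64-hex-char digest published by the developer>"

def signer_cert_sha256(apk: str) -> str:
    """Extract the signing certificate digest that apksigner reports."""
    out = subprocess.run(
        ["apksigner", "verify", "--print-certs", apk],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"SHA-256 digest: ([0-9a-f]{64})", out)
    if not match:
        sys.exit("no certificate digest found in apksigner output")
    return match.group(1)

actual = signer_cert_sha256("downloaded.apk")  # placeholder file name
if actual == EXPECTED.lower():
    print("Certificate matches the published digest")
else:
    print("MISMATCH: got", actual)
    sys.exit(1)
```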
AppVerifier is only needed for the first install. If devs don't publish their certificate, does this even make sense?
That "F-Droid security issues" is only about the official repo afaik, so not useful.
First install? We should be checking the APK every time it is downloaded. That means first install & updates.
xuid0 We should be checking the APK every time it is downloaded. That means first install & updates.
The AOSP package manager, which handles installation and updates of apps/APKs, pins the signature on install, and then all updates must be signed with the same cert or they are rejected.
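To illustrate the rule, here is a toy model of that invariant, not Android's actual implementation:

```python
# Toy model of AOSP signature pinning: pin the signer on first install,
# then reject any update whose signing certificate digest differs.
installed = {}  # package name -> pinned signing-cert digest

def install_or_update(package: str, cert_digest: str) -> None:
    pinned = installed.get(package)
    if pinned is None:
        installed[package] = cert_digest  # first install: pin the cert
        print(package, "installed, certificate pinned")
    elif pinned == cert_digest:
        print(package, "update accepted (same signer)")
    else:
        # Corresponds to INSTALL_FAILED_UPDATE_INCOMPATIBLE on Android.
        raise PermissionError(package + ": update rejected, signer changed")

install_or_update("org.example.app", "digest-A")  # hypothetical package/digest
install_or_update("org.example.app", "digest-A")  # same signer: accepted
try:
    install_or_update("org.example.app", "digest-B")  # new signer: rejected
except PermissionError as err:
    print(err)
```

This is why AppVerifier only strictly matters on first install: after that, the package manager itself enforces signer continuity.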