ryrona
Where and why would they import APKs? Aren't they building everything from source? If they are importing already pre-compiled APKs without verifying them by reproducing the builds, that is a pretty big issue in my threat model, even if there is no certificate pinning bypass.
They almost entirely build and sign the apps themselves. For a small portion of the apps, they use the developer-signed APKs if they match what they build. This can go very wrong, with updates getting indefinitely delayed, especially since they use a problematic and outdated build environment. There are prebuilt libraries either way. They do not review anything when fetching and building the code but rather do automated scanning, including with antivirus. Fetching and building the code is automated. Signing is triggered manually for batches of apps. It's almost entirely automated on a server other than signing, and they might automate signing too. The signing not being automated doesn't mean there's some kind of review process before it's signed.
Because, unfortunately, there are no alternatives.
There are a bunch of alternatives. F-Droid is hardly the only way to obtain builds of open source apps.
App developers seldom have the expertise and capabilities to build their apps in secure environments; they just build them on their regular computer. Most of them probably store the signing key there too, without protecting it using dedicated hardware at all. Who knows how many people beyond the app developer can modify or sign the releases. Individual app developers may also easily comply if an authority pressures them to sign malicious builds. I think it is fair to say that mostly only developers of privacy apps and cryptocurrency apps get this right, and even they have been hit by supply chain attacks repeatedly, while traditional Linux repositories have withstood those attacks better, even if only because they move somewhat slower. The xz backdoor failed because it targeted those slow-moving Linux distributions, but many people have lost all their cryptocurrency because the wallet app they were using upgraded to a malicious dependency, without the developer knowing the dependency was compromised, and it was all deployed to end users within days.
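The malicious-dependency scenario above is what hash-pinned lockfiles are meant to catch: the build recomputes each fetched artifact's digest and refuses anything that doesn't match what was pinned when it was last reviewed, so a silently swapped upstream release fails the build instead of shipping to users. A minimal sketch of the idea (the artifact name and pinned contents below are hypothetical, not from any real incident):

```python
import hashlib

# Hypothetical lockfile: artifact filename -> sha256 recorded at review time.
PINNED_HASHES = {
    "wallet-crypto-lib-1.4.2.tgz": hashlib.sha256(b"reviewed contents").hexdigest(),
}

def verify_dependency(name: str, data: bytes) -> bytes:
    """Refuse any artifact that is unpinned or differs from its pinned digest."""
    expected = PINNED_HASHES.get(name)
    if expected is None:
        raise ValueError(f"{name}: no pinned hash, refusing to use it")
    actual = hashlib.sha256(data).hexdigest()
    if actual != expected:
        raise ValueError(f"{name}: digest mismatch, upstream artifact changed")
    return data
```

Real ecosystems provide this natively, for example pip's hash-checking mode (`--require-hashes`), Gradle's dependency verification, and the integrity fields in npm lockfiles, though it only helps if the developer actually reviews what they pin.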
The xz backdoor did not fail. It was shipped and then discovered by someone running it in production, due to a completely unnecessary flaw in the backdoor. It shipped in the more bleeding-edge variants of Debian and RHEL. It was only caught because it had a major performance issue the attackers should have avoided. They also unnecessarily modified the release tarball in a way that wasn't why it was detected but could have been. They made nearly all their changes in the source repository but then unnecessarily bundled the final bits of the obfuscated payload in the tarball. That's a strange decision, since it was well obfuscated as compressed data in tests. It looks a lot like they spent a long time working slowly and carefully but then rushed to finish and blew their cover in doing so. It's entirely possible a number of servers were compromised by it, based on crawls of every server running sshd across the whole IPv4 space looking for it.
Those who maintain an app repository are expected to be able to build apps in a secure environment, to maintain signing keys in a secure way using dedicated hardware, and to be able to act on reports of backdoors and security vulnerabilities even when the app developer doesn't.
F-Droid builds on a server, not local infrastructure. That's much worse than a developer building on their workstation, especially since it would result in the entire repository being compromised, not just a single app. A developer's workstation being compromised means a backdoor can be put directly in the sources, and as the xz situation showed, a backdoor being present in the sources doesn't mean it will be found. The xz backdoor was largely present in the source repository and fully present in the source tarball. It was a source code level backdoor, not something included only in binary releases from the project. You portray it as if that's not the case, but it was, and it did ship in distributions including Debian. It didn't make it all the way to Debian stable because of how slowly that updates. If the attackers had done a better job and not introduced a huge performance problem, it would not have been discovered when it was. It's unclear when it would have been discovered.
Yes, I have zero trust in Accrescent. They take the pre-compiled APKs from the developers, do not perform any verification like reproducible builds, and, despite their very small number of apps, have already accepted one with a bundled affiliate link containing a tracker, and seem not to care about that. (Organic Maps, by the way; F-Droid removes that tracker.)
That's not a tracker.
That would go from having minimal peer review to having no peer review at all. Absolutely not a suitable choice in my threat model. I need that peer review, my security very much relies on that.
F-Droid does not review the code. They automatically fetch and build it. They scan for non-open-source code and run it through antivirus. That's what they actually do, not what people believe they do.
Unless we have an alternative by then, it would probably be the point at which I have to stop using phones for security and privacy sensitive things.
You prefer a system where a backdoor was shipped to users and only discovered because it had awful performance, because you wrongly attribute the discovery of that backdoor to the system itself. That doesn't make much sense. The reality is that the system you're promoting completely failed at stopping this and adds a huge number of trusted parties. What protected people was extremely delayed updates leaving them vulnerable to many unpatched security bugs, not any kind of review or safety provided by thousands of packagers acting as additional trusted parties in Debian.
The xz situation in fact demonstrated that this packaging system, even with reproducible builds, will not stop developers shipping a backdoor to users via the source code built by these distributions / repositories. It was a backdoor in the source tarball, and they shipped it to users. It was caught because of the incredibly poor performance caused by the backdoor's low-quality implementation, not by these packaging systems.