Hey everyone, I've been doing some light research both here (grapheneos.org guides) and on the AOSP website (source.android.com). I was wondering if there's any kind of debug-level endpoint that can be examined to track the use of keys, see which apps/utilities use which keys, and get a better look at how the Android operating system uses its keystore.

Since my journey began as a Pixel user (and with Graphene, which I love), it seems as though documentation is scarce concerning the precise Android security model. It also seems as though (by default) an Android device doesn't have access to its own keys - applications and device builds are usually signed by the manufacturer or OS distributor, with the user having no way of gaining higher-level control over signed files. As a Linux user, I want the agency to be stupid with my own keys.

I understand that signing with my own keys makes the build incompatible with OTAs and destroys a large part of the security model. But for the hardcore Linux features I want, I would argue that Graphene OS is still by far the best option for me and for other users who would be running a root setup anyway if Graphene didn't exist. I truly believe Graphene is an everyman's OS supporting multiple use cases in one way or another. Even power users/devs who like to mess around with their device could benefit (e.g., Graphene with root is still unsafe, but infinitely safer than Lineage with root due to the additional sandboxing and memory protection). Even just being able to toggle network access for different apps makes a huge difference in security, not to mention the other features Graphene has built in.

At the end of the day, the keystore runs the show, getting to decide which apps should be granted which permissions and for which purposes. I'm trying to investigate how these keys are used during runtime for my own custom build that I use for personal development and writing privileged bash scripts. From what I can tell reading the AOSP documentation, the keys are necessary for certain vendor features that Linux users may find attractive, such as DLKM. I'd rather have my own keychain for features like this, which would also open the door for other development possibilities such as APEX packages as a hobby/side project. These normally aren't possible because on 99% of Android devices somebody else owns your private keychain. Even the ones that are documented (media, network stack, testkey, etc.) are just a few keys inside the build system. In reality, I found a metric fuckload (hundreds of undocumented private keys) in the AOSP repo by using a grep command. How can an Android user have any idea what the chain of trust looks like when everything is so obfuscated and poorly documented?
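
To be concrete, what I have in mind is something like the keychain generation described in the release-signing guide at source.android.com (sketching from memory, so the subject string is just an example):

# generate a personal set of release keys under ~/.android-certs
subject='/C=US/ST=Somewhere/L=Somewhere/O=Me/OU=Me/CN=Me/emailAddress=me@example.com'
mkdir -p ~/.android-certs
for x in releasekey platform shared media networkstack; do
    ./development/tools/make_key ~/.android-certs/$x "$subject"
done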

Does anyone know how to track keys? It goes without saying that custom Graphene OS builds don't fall in line with the standard security model, but it still seems far above any other OS for my purposes. I'd still be using CyanogenMod if it weren't for Graphene! The Graphene OS team has done amazing things for the Android community as a whole, which includes many types of users.

In theory, it should be possible for people to sign their own builds and still be able to use attestation, like a "poor man's ro.secure." Of course, this can only be guaranteed to the extent that you aren't doing things that change the signatures or cause a mismatch during attestation (which covers a lot of root-related things). Ultimately though, validation is still accomplished by signature-checking certain hashes - and even a rooted phone maintains its own set of signatures. Having keys that are not centralized or widely distributed seems like it could be a somewhat valid (but ultimately less secure) approach to security for obscure builds running on only a handful of devices. I'm curious to find out if Graphene's Auditor app could be used for peer-to-peer attestation in this case - where keys are not widely distributed or known by a centralized party. To be honest, it seems kind of crazy to me that people blindly trust Google with their private keys and see this as completely normal.

  • de0u replied to this.

    fosh19991 I was wondering if there's any kind of debug-level endpoint that can be examined to track the use of keys, see which apps/utilities use which keys, and get a better look at how the Android operating system uses its keystore.

    I'm not sure "key store" is being used here in the sense that Android uses it (example).

    fosh19991 Since my journey began as a Pixel user (and with Graphene, which I love), it seems as though documentation is scarce concerning the precise Android security model.

    The system is very complicated. It is possible that parts of it aren't fully documented. That said, it's not clear that documentation is scarce. What has been consulted so far?

    fosh19991 It also seems as though (by default) an Android device doesn't have access to its own keys - applications and device builds are usually signed by the manufacturer or OS distributor, with the user having no way of gaining higher-level control over signed files.

    An important part of the Android security model is that system images are signed by a key for which the private half is not present on the device. That is, the system is explicitly designed so that builds happen off-device and are signed by keys that are very carefully protected. This limits the ability of malware compromising a device to hide itself. In this way (and others) the Android model is arguably superior to the standard Linux model.

    fosh19991 I understand that signing with my own keys makes the build incompatible with OTAs and destroys a large part of the security model.

    Not really. If you sign your own build and then later generate an OTA image signed by the same key, the security model would be 100% intact.
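
    As a sketch of that flow, per the release-signing guide at source.android.com (the paths and key directory here are examples, not exact):

    # re-sign the target-files package with your own keys, then build an OTA from it
    ./build/tools/releasetools/sign_target_files_apks -o -d ~/.android-certs \
        out/dist/*-target_files-*.zip signed-target_files.zip
    ./build/tools/releasetools/ota_from_target_files -k ~/.android-certs/releasekey \
        signed-target_files.zip signed-ota_update.zip

    An OTA built this way verifies against the same releasekey as the installed build, so updates keep working.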

    fosh19991 At the end of the day, the keystore runs the show, getting to decide which apps should be granted which permissions and for which purposes.

    The key store mostly signs things, or verifies signatures, or encrypts things, or boxes keys up for transfer to other parties. By explicit design, most decisions are made outside of the key store.
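
    Regarding the original question about observing key use at runtime: one crude option is to watch the key store daemon's log output over adb (a sketch; the tag is keystore2 on Android 12 and later, keystore on older releases, and how much gets logged varies):

    # follow key store activity in the system log
    adb logcat -s keystore2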

    fosh19991 In reality, I found a metric fuckload (hundreds of undocumented private keys) in the AOSP repo by using a grep command.

    Exactly what is meant by this is unclear. Providing a few examples might shed some light.

    fosh19991 To be honest, it seems kind of crazy to me that people blindly trust Google with their private keys and see this as completely normal.

    Again, exactly what is meant by this is unclear. Providing a few examples might shed some light.

      de0u Great response, very informative. Here's what I found, and why I wrote this post to begin with (I want to emphasize that I'm still learning about Android security, but I do have a little experience with security practices from using things like PGP for other purposes over the years).

      The system is very complicated. It is possible that parts of it aren't fully documented. That said, it's not clear that documentation is scarce. What has been consulted so far?

      I've mainly been using build guides and sources like the one at source.android.com you linked above. I've found that grapheneos.org and source.android.com are perhaps the only reliable sources of information, and other sources (e.g., Stack Overflow) seem to reference these. Lately I've been doing most of my reading at source.android.com. The documentation is good, but examples are scarce and some things aren't elaborated on. AI assistants will honestly just tell you to go break your phone if you want to talk to them about low-level Android development and build signing.

      An important part of the Android security model is that system images are signed by a key for which the private half is not present on the device. That is, the system is explicitly designed so that builds happen off-device and are signed by keys that are very carefully protected. This limits the ability of malware compromising a device to hide itself. In this way (and others) the Android model is arguably superior to the standard Linux model.

      This makes a lot of sense. Keeping the keys off-device is no doubt more secure than keeping them onboard, and it makes sense not to expose these keys whenever possible. Depending on your use case, though, it can be useful to have your own set, as you mentioned above.

      Not really. If you sign your own build and then later generate an OTA image signed by the same key, the security model would be 100% intact.

      The only way to know what goes into your OS is to sign updates yourself. I understand that this isn't for everyone, but it allows you to take more ownership of your software and makes you responsible for examining and shipping updates yourself. This ensures that a malicious actor (whether it's Google, Samsung, or anyone else with the keys) cannot overwrite signed apps or partitions on the device without your consent. Maintaining your own keychain is also the only way to gain control of package signing yourself, which can be useful for security in its own right. The less widely distributed your signing keys are, the more difficult your device is to compromise. I'd argue that an ideal security model requires not trusting any third parties at all.
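
      As a small illustration, anyone can check which certificate actually signed a given APK with apksigner from the SDK build-tools (the package path below is just an example):

      # pull an APK off the device and print its signing certificate(s)
      adb pull /system/app/SomeApp/SomeApp.apk
      apksigner verify --print-certs SomeApp.apk

      If you hold the key behind that certificate, you decide what updates to that package look like; if somebody else holds it, they do.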

      As Graphene OS users, many of us started using it because we didn't trust Google, who can push OTA updates whenever they want, which sometimes include spyware. A lot of what Google does is not outright malicious, but it can harm the privacy of end users. I sometimes do freelance work as an AI trainer for these companies, and I've come to realize that all of them are basically just the private branch of the surveillance state. A lot of people don't understand how valuable their data is; giving it away to Google for free is a bad move imo. I can elaborate on this, but I'm sure that many people on this forum already share this view.

      In reality, I found a metric fuckload (hundreds of undocumented private keys) in the AOSP repo by using a grep command.

      Run this command in the Android source repo:

      ls -R | grep pk8

      It should turn up hundreds of private keys. It's possible to get an idea of what some of these keys are for based on names and context (such as inbuilt APEX packages, system utilities, etc.), but for others the name doesn't give much information - and even then, a name means little without seeing the underlying code yourself. It isn't exactly clear what these apps and keys do, but a lot of them have elevated permissions just by virtue of the package type. And since we didn't sign them, we can't tell any of these apps what to do. It might be possible to re-sign them, or to compile Android with your own keys in their place (I haven't tried this yet, but assuming they get signed at build time it should work).

      To be honest, it seems kind of crazy to me that people blindly trust Google with their private keys and see this as completely normal.

      Linux has always allowed users to become their own root of trust - perhaps I'm old-fashioned, but I don't know why we so willingly gave this up on Android and other mobile OSes. Furthermore, I'm utterly terrified at what these companies are doing with our data. Under the harmless premise of "improving Microsoft/Google/OpenAI products by collecting usage data," they take as much as they can without sharing any details. Google Docs, Microsoft Teams, and daily computer usage on Windows all send these companies data for free - hardly anything is private anymore. I know firsthand that these companies pay millions of dollars for relatively small/insignificant datasets compared to what they get for free, because we basically agree to give it to them. Meta already got in trouble for torrenting books, and the others have had similar issues. The individual has a lot less power than publishing companies (books, software, etc.), so it's even easier for your data to be taken without any recourse. We don't ask, they don't tell - all we know is that they're "improving products".

      • de0u replied to this.

        fosh19991 Run this command in the Android source repo:

        ls -R | grep pk8

        But a lot of those files are things like external/mbedtls/tests/data_files/ec_prv.pk8.pw.der (a key used by a test). And some of them are things like out/soong/.intermediates/external/boringssl/libcrypto/android_x86_64_shared_apex31/obj/external/boringssl/src/crypto/pem/pem_pk8.o (which is code to parse a key file, not any kind of key).

        This system is complicated. Any attempt to quickly approximate what is going on has a fair chance of measuring the wrong thing.
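
        If the goal is to count key files actually checked into the tree, something narrower helps, though it is still only an approximation:

        # match only .pk8 files, skipping build output and test fixtures
        find . -path ./out -prune -o -name '*.pk8' -print | grep -v '/tests/'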

        fosh19991 Linux has always allowed users to become their own root of trust - perhaps I'm old-fashioned, but I don't know why we so willingly gave this up on Android and other mobile OSes.

        It is true that GrapheneOS inherits from AOSP the use of a Linux kernel. But thinking about GrapheneOS as some weird restrictive kind of Linux is likely to be a misunderstanding factory. Android does lots of things differently from traditional Linux because traditional Linux runs based on ambient authority, with little sandboxing, and has very little resistance to malware persistence. Many of the differences between "regular Linux" and Android are because Android is doing a better job at security. This may be of interest: "Linux" at "madaidan's insecurities".

        fosh19991 I'd argue that an ideal security model requires not trusting any third parties at all.

        If by "ideal" you mean "not possible", I agree.

        I think it would be very very very hard at present to avoid trusting whoever built one's compiler.

          de0u

          "Linux" at "madaidan's insecurities"](https://madaidans-insecurities.github.io/linux.html).

          This was a good read, if not a bit horrifying. To think that the first time I saw Android changing my UID from 0 on a per-app basis, I was just mad that Google had tried to take away my sudo privileges lol...

          I don't consider Graphene to be a weird restrictive version of Linux, but the fact that many pre-installed apps on any AOSP distribution have more privileges than me, the device owner, still leaves something to be desired. For many people a phone is a phone, so this doesn't matter. Out of all the OSes out there, Graphene is still my favorite.

          I think I would probably trust GCC for no other reason than the fact that it's older than me, but I see what you're saying. Backdoors and vulnerabilities are everywhere. The article helped me better understand why things are the way they are on Android, and I do appreciate its hardened security a bit more.

          That said, it seems like signing your own build is probably still better than not doing so. It's less user-friendly and potentially "less secure" depending on what you're doing with the build, but it doesn't entirely eliminate the security features that are unique to Android. For example, per-app UIDs address some of the concerns with sandboxing in Linux, unless you allow an app root access with a (preferably native) su binary. These are things I think would still be acceptable from a security standpoint (if you "trust" the app and are willing to deal with the consequences). Occasionally using 'su' is definitely more secure than root on raw Linux, given the problems mentioned above, but is obviously much less secure than raw Graphene. Nonetheless, I should be able to make that choice as a user.
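
          For anyone curious, the per-app UIDs are easy to see over adb (a quick sketch; toybox ps flags can vary by release):

          # each third-party app runs as its own u0_aNNN Linux user
          adb shell ps -A -o USER,NAME | grep u0_a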

          This isn't something unique to Graphene or AOSP as a whole - every ROM does it. Most users don't care about root and would never need it; it's probably for their own good. After all, Android was made for normal people to operate a phone, not Linux enthusiasts. Personally, I'd still like to have a little more control over my setup. Some parts of Android feel like they took Linux and sucked all of the fun out of it, even if that's justified. I think it would be cool to have a Linux phone, but that's just me! And I'll be the first to admit that this weakens the security model.

          I'll probably continue to experiment with building when I have time, at my own risk. It'd still be nice to get away from third parties (within reason) for the reasons discussed earlier, especially since, in theory, doing so would only add to security. Perhaps I can find a sweet spot between security and usability.

          • de0u replied to this.

            fosh19991 The fact that many pre-installed apps on any AOSP distribution have more privileges than me, the device owner, still leaves something to be desired.

            The person tapping on the screen may not be the same person as the device owner. This is particularly true of devices issued by companies to employees for specific purposes.

            Even when the person tapping on the screen is the same person as the legal owner of the device, Android is designed on the assumption that 99% of device owners have no idea whether or not it's safe to agree to giving some app some permission. Over time Google is making it harder and harder to grant apps specific permissions that have been abused by malware precisely because thousands of people have in fact granted accessibility permissions, or SMS permissions, to malware.

            One of the great things Google has done with the Pixel line is enabling people to install custom operating systems, to sign those custom builds, and further to have all of the security features of the hardware work for custom OS builds. While there are other phone platforms that somewhat achieve this, they come with other hurdles, such as weaker hardware security, shorter firmware support lifespans, and slower patching of firmware bugs.

            Because of the environment Google has provided, you have the option of doing your own build and signing it. And as a result of that, it's not true that apps on your device have more privileges than you do. By building your own OS you enter the realm of "If you break it, you get to keep both halves".
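
            Concretely, on Pixels that includes enrolling the public half of your own verified-boot key (a sketch based on the AVB tooling; the file names are examples):

            # extract the public key from your AVB signing key and enroll it in the bootloader
            avbtool extract_public_key --key avb_releasekey.pem --output pkmd.bin
            fastboot erase avb_custom_key
            fastboot flash avb_custom_key pkmd.bin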

            fosh19991 Occasionally using 'su' is definitely more secure than root on raw Linux, given the problems mentioned above, but is obviously much less secure than raw Graphene. Nonetheless, I should be able to make that choice as a user.

            Anybody who builds GrapheneOS does get to make that choice. And in theory if somebody wanted to fork the OS (including changing the name!) and do builds and make them available to users and then support the users, that somebody could make it convenient for the person tapping on each screen to disable security.

            The GrapheneOS developers choose not to support a community of users with security disabled, but nobody is forced to run GrapheneOS.

            Please note that I do not speak for the GrapheneOS project.

              de0u Understood. I don't plan on distributing my build - nobody would want it anyway lol. But I'm hoping to have fun exploring Android, perhaps with some CLI shenanigans on the side 😅

              No doubt, the documentation and support that Graphene OS provides is top notch. It's totally understandable that they don't directly support unofficial builds, especially ones that can break your phone, but the extensive docs they provide leave the door open for users to choose their own experience - they tell you how to do almost everything and are very educational. It's great for the community and has made Android fun for me again.

              Thanks for the discussion, this definitely made some things clearer for me. Hopefully I don't break my phone having too much fun, but I can't say the docs didn't warn me! Hopefully Graphene is around for many years to come.