• Announcements
  • Claims made by forensics companies, their capabilities, and how GrapheneOS fares

[deleted]

I agree that it's conceivable that an AP exploit by itself enables this. However, on iOS 15-15.8.x, A11-A13 have no IPR while on the same exact OS version, A14-A15 do. Perhaps it's more of an engineering constraint than a technical limitation.

As can be seen from the Android table, they have limited resources to deal with older generations. That also explains why they haven't developed an exploit for a several-year-old Titan M2 firmware version. They want something working against current and future versions, not just increasingly irrelevant past versions. If they only had to develop an exploit against the earliest Titan M2 firmware, before major improvements happened, it wouldn't be nearly as difficult.

[deleted] According to you, even without setting an additional database password in Molly, AFU extraction will not yield the plaintext database? https://github.com/mollyim/mollyim-android/wiki/Data-Encryption-At-Rest doesn't say anything about this. Could @Nuttso clarify?

GrapheneOS It uses the StrongBox keystore and should be setting the key as requiring an unlocked device. You can check the code to see if it does that.

Molly doesn't enforce the unlock state for its keystore key. Otherwise Molly wouldn't be able to start in background when the phone is locked and wouldn't be able to show notifications. Same with Signal.

When the database isn't protected by a passphrase, Molly should be able to open its database even if the device is locked.

So it can't set up its keystore key with the authentication protection (aka require the unlock state).

A phone in AFU state, screen locked, app unlocked -> Molly and Signal are in the same situation. There's one difference that makes some exploits work against Signal but not against Molly: Molly enforces the StrongBox keystore.

They'd need the phone in AFU state and to exploit some vulnerability, typically an LPE (local privilege escalation).

The ideal scenario is for an app to do its encryption work only once the user is authenticated (device unlocked). But for messaging apps that need to show notifications in the background while the device is locked, it's not that simple.
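Android's actual mechanism for this is the Keystore API: a key generated with `KeyGenParameterSpec.Builder.setUnlockedDeviceRequired(true)` (plus `setIsStrongBoxBacked(true)` for StrongBox) can't be used while the device is locked. A minimal toy model in Python (not the real API) of why that conflicts with background notifications:

```python
import os

class ToyKeystore:
    """Toy model of a keystore that can gate key use on the device lock state."""

    def __init__(self):
        self.locked = True
        self._keys = {}  # name -> (key bytes, unlocked_device_required flag)

    def generate_key(self, name, unlocked_device_required):
        self._keys[name] = (os.urandom(32), unlocked_device_required)

    def use_key(self, name):
        key, needs_unlock = self._keys[name]
        if needs_unlock and self.locked:
            raise PermissionError(f"key {name!r} requires an unlocked device")
        return key

ks = ToyKeystore()
ks.generate_key("db_key", unlocked_device_required=True)     # protects data at rest
ks.generate_key("push_key", unlocked_device_required=False)  # usable in background

ks.use_key("push_key")  # works while locked -> push handling is possible
try:
    ks.use_key("db_key")  # fails while locked -> no message previews from main DB
    db_key_usable_locked = True
except PermissionError:
    db_key_usable_locked = False
```

This is why a messenger that decrypts pushes while locked can't gate its main database key on the unlock state without splitting its keys the way the trade-off above describes.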

    • [deleted]

    • Edited

    GrapheneOS Signal doesn't do it on either Android or iOS, but Molly exists on Android which in addition to their app passphrase uses the StrongBox keystore with a key requiring authentication and an unlocked device. Apps could transparently use the keystore that way without an app passphrase too.

    That's incorrect regarding Signal on iOS. The Signal chat database on iOS uses NSFileProtectionComplete. Only FFS yields the Signal database on iOS, not AFU extraction.

    [this is not correct, it doesn't keep data at rest after first unlock on either Android or iOS even though it could]

    File system: Signal database is encrypted. The encryption key is stored in the keychain with the highest protection class. The only way to extract Signal conversations requires extracting the file system images and decrypting the keychain.
    https://blog.elcomsoft.com/2020/04/forensic-guide-to-imessage-whatsapp-telegram-signal-and-skype-data-acquisition/

    GrapheneOS That's not correct. Android supports keeping data at rest while locked.

    GrapheneOS The info you're quoting isn't correct since it ignores the hardware keystore on Android. Not everything they claim is correct.

    I agree. They're probably only talking about the most common situations.

    GrapheneOS Android does the first part of key derivation in the OS via scrypt. It derives keys with a simple personalized hash approach for each purpose from the scrypt key derivation, which is then passed along to the different use cases. One of those is the Weaver feature implemented by the secure element. Another is the initial authentication of the Owner user with the secure element which authorizes signed updates with a newer version of secure element firmware. Another is unlocking the keystore.
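A rough Python sketch of that layered derivation (the scrypt parameters and the per-purpose personalization below are illustrative stand-ins, not Android's actual scheme):

```python
import hashlib
import hmac
import os

def derive_root(credential: bytes, salt: bytes) -> bytes:
    # First stage in the OS: scrypt over the unlock credential.
    # n/r/p are illustrative values, not the ones Android uses.
    return hashlib.scrypt(credential, salt=salt, n=2**14, r=8, p=1, dklen=32)

def derive_for_purpose(root: bytes, purpose: str) -> bytes:
    # Simple personalized hash per purpose (Weaver, keystore, ...),
    # so each consumer gets an independent key from the same root.
    return hmac.new(root, purpose.encode(), hashlib.sha512).digest()[:32]

salt = os.urandom(16)
root = derive_root(b"correct horse battery staple", salt)
weaver_key = derive_for_purpose(root, "weaver")
keystore_key = derive_for_purpose(root, "keystore")
```

The point of the personalization step is that compromising one consumer's key reveals nothing about the others or about the scrypt root.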

    The Weaver token is how Android implements time-based throttling via the secure element. The secure element has no direct involvement in key derivation. It's not super high performance hardware and is a poor place to do that for protecting against attackers able to bypass all the hardware-based security features.
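A toy model of a Weaver-style slot, assuming the behavior described above (the delay schedule is made up for illustration): the slot stores a secret, releases it only for the correct token, and enforces escalating delays after failed attempts.

```python
class ToyWeaver:
    """Toy Weaver slot: time-based throttling lives in the secure element,
    which never participates in key derivation itself."""

    DELAYS = [0, 0, 0, 0, 0, 30, 30, 30, 30, 30, 60]  # seconds, illustrative

    def __init__(self, token: bytes, secret: bytes):
        self._token, self._secret = token, secret
        self.failures = 0
        self._not_before = 0.0  # earliest time the next attempt is allowed

    def read(self, token: bytes, now: float):
        if now < self._not_before:
            raise RuntimeError("throttled: try again later")
        if token != self._token:
            self.failures += 1
            delay = self.DELAYS[min(self.failures, len(self.DELAYS) - 1)]
            self._not_before = now + delay
            return None  # wrong token: no secret, delay escalates
        self.failures = 0
        return self._secret

w = ToyWeaver(token=b"derived-from-pin", secret=b"weaver-slot-secret")
```

The released secret feeds back into the OS-side key derivation; the throttling is the secure element's only job here.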

    The final key derivation is done in the Trusted Execution Environment (TEE) on the main SoC, which means in TrustZone. It uses an SoC cryptographic acceleration feature providing hardware-bound key derivation as one of the features. The hardware-bound aspect means that exploiting the TEE shouldn't provide any way either direct or indirect via side channels to get access to the key used in the hardware-bound key derivation algorithm. On Snapdragon, the Qualcomm Crypto Engine provides these features and is meant to be used in the TEE applet for this.

    Thank you for the thorough explanation. Could you point me to more resources on how the Weaver token does time-based throttling, or how hardware-bound key derivation works in the TEE on Pixels?
    Apple has a page compiling how all the key derivations work across its subsystems: https://support.apple.com/guide/security/secure-enclave-sec59b0b31ff/1/web/1

    GrapheneOS The amount of time spent on hardware-bound key derivation varies by device and we don't know specifically how long it's configured to take in each Pixel generation.

    GrapheneOS The hardware-bound key derivation part is in the TEE and we can't change that firmware beyond requesting improvements as we've done successfully in several areas.

    I feel this is a critical last line of defense: it's the only timing delay enforced by cryptography rather than code integrity, albeit on processors with a smaller attack surface. The Supersonic BF on iPhone runs at almost the same speed as the 80ms theoretical hardware key derivation time quoted by Apple. This means all other Secure Enclave based mitigations have been bypassed, and the only line of defense against a truly unlimited-speed brute force is this cryptographically enforced iteration count. Perhaps the @GrapheneOS team can ask the Pixel team to publish this data and increase the timing delay to at least Apple's 80ms standard, which would have negligible user impact.
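For scale, the arithmetic behind that 80 ms figure: once every other mitigation is bypassed, the hardware-bound derivation time is the only per-guess cost left.

```python
# Worst-case brute-force time for a 6-digit PIN when the only remaining
# cost per guess is the hardware-bound key derivation (80 ms, per Apple's
# published figure for the Secure Enclave).
guesses = 10**6            # 6-digit PIN space
seconds_per_guess = 0.080  # enforced by the derivation itself
worst_case_hours = guesses * seconds_per_guess / 3600
# ~22.2 hours worst case, half that on average. With no cryptographically
# enforced derivation time, the same space falls in seconds on fast hardware.
```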

      • [deleted]

      • Edited

      Nuttso Molly doesn't enforce the unlock state for its keystore key.

      On iOS, the Signal chat database uses NSFileProtectionComplete. Only FFS yields the Signal database, not AFU extraction.

      [this is not correct, it doesn't keep data at rest after first unlock on either Android or iOS even though it could]

      File system: Signal database is encrypted. The encryption key is stored in the keychain with the highest protection class. The only way to extract Signal conversations requires extracting the file system images and decrypting the keychain.
      https://blog.elcomsoft.com/2020/04/forensic-guide-to-imessage-whatsapp-telegram-signal-and-skype-data-acquisition/

      [this is not correct, it doesn't keep data at rest after first unlock on either Android or iOS even though it could]

      Nuttso Otherwise Molly wouldn't be able to start in background when the phone is locked and wouldn't be able to show notifications. Same with Signal.

      Why wouldn't notifications work? The push token can be saved with NSFileProtectionCompleteUntilFirstUserAuthentication while the main database key is saved with NSFileProtectionComplete.

      To show a preview of the notification, however, I speculate that an ephemeral private key could be stored temporarily with NSFileProtectionCompleteUntilFirstUserAuthentication and incoming messages saved to a separate pending database. But to merely notify the user that a new message was received, without any preview, almost everything could be secured with NSFileProtectionComplete.

      Nuttso So it can't set up its keystore key with the authentication protection (aka require the unlock state).

      But Signal on iOS works exactly like this. See above.

      [this is not correct, it doesn't keep data at rest after first unlock on either Android or iOS even though it could]

        Suppose a strong passphrase is used for the Owner profile, a 6-digit PIN is used for a secondary user profile, and the device is in BFU mode. Would the secondary user profile remain secure as time goes by and new exploits against the secure element and the OS are developed?

        • de0u replied to this.

          DeletedUser115 I do not recall where (I believe!) I saw this, but I think the situation is:

          1. At present the only way to use the phone to access the secondary profile is to unlock the owner profile and then switch to the secondary profile, but
          2. If somebody disassembles the device (etc.), the secondary profile's storage is encrypted with keys that are not derived from the owner profile's PIN/passphrase.

          So I think if it is desired for Profile X's storage to be highly secure then Profile X needs a long random passphrase.

          If I'm wrong, I'm sure I'll be corrected!

            de0u Thank you for your reply!

            Reading https://grapheneos.org/faq#encryption

            The owner profile is special and is used to store sensitive system-wide operating system data. This is why the owner profile needs to be logged in after a reboot before other user profiles can be used.

            I was hoping that some key material for secondary user profiles would be stored in the Owner profile and encrypted with the Owner passphrase. I guess that is not really the case; it would be good to get official confirmation from the GrapheneOS team.

            Assume the secondary profile's encryption key is based solely on 1) the user's lock method (a 6-digit PIN in my example) and 2) a secret in the secure element, which it won't release unless supplied with the correct 1) or compromised. If this is how it works and no further secret from the Owner profile is needed, then a compromised secure element would allow brute-forcing the PIN and decrypting the secondary profile's data. Now the question is whether the device has to be disassembled to perform this attack?
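If the assumption above holds (the key depends only on the PIN and the secure element secret, with no Owner-profile material), a toy Python sketch shows why a leaked secure-element secret reduces the attack to enumerating the PIN space. The derivation is purely illustrative, and a 4-digit space is used only to keep the loop small:

```python
import hashlib
import hmac
import os

def profile_key(pin: str, se_secret: bytes) -> bytes:
    # Toy derivation combining only the PIN and the secure-element secret,
    # per the assumption that no Owner-profile material is involved.
    return hmac.new(se_secret, pin.encode(), hashlib.sha256).digest()

se_secret = os.urandom(32)
real_key = profile_key("4821", se_secret)

# An attacker who extracted se_secret just walks the PIN space offline,
# with no throttling left to slow them down.
recovered = next(
    pin for pin in (f"{i:04d}" for i in range(10_000))
    if profile_key(pin, se_secret) == real_key
)
```

The Owner passphrase never appears in the derivation, which is exactly why its strength wouldn't help the secondary profile in this scenario.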

            • de0u replied to this.

              DeletedUser115 Now the question is whether device has to be disassembled to perform this attack?

              I don't believe so (which is why I wrote "etc."). Compromising hardware security via an exploit is also theoretically possible.

              DeletedUser115 I was hoping that some key material for secondary user profiles would be stored in the Owner profile and encrypted with the Owner passphrase. I guess that is not really the case, would be good to get an official confirmation from GrapheneOS team.

              Ok, I went and searched harder.

              @DeletedUser115, I believe you received an official answer to this question, or to a very similar question, a year ago: https://discuss.grapheneos.org/d/5274-is-second-user-profile-encrypted-also-by-first-user/9 That is, I believe the answer received then to the two-part question you posed then works out to "Both #1 and #2 are false".

              If a question still remains in light of that answer, might it be possible to phrase it differently so it is clear which part(s) are not yet addressed?

                de0u I believe you received an official answer to this question, or to a very similar question

                Thank you for digging it up! I do remember that discussion, but the rationale given there included:

                While it is true that currently Owner has to be unlocked before attempts on secondary user profiles can be made, it isn't out of the question for AOSP to change that behavior in the future if they regard it as a limitation, which they likely do (from a UX standpoint and considering the fact that multiple users are meant to be used by individual people, having to have the owner present before you can unlock your own profile after a reboot isn't great).

                It's understood that it's safer not to rely on the Owner profile for protecting data on secondary profiles. But I am curious whether the Owner passphrase adds any additional protection to secondary profiles as of now, not taking into account any possible change in AOSP in the future.

                For context, the reason I am asking it is remembering a second (or more) random passphrases for secondary user profiles is a lot of cognitive load for an older lady like me lol.

                • de0u replied to this.

                  DeletedUser115 It's understood that it's safer not to rely on the Owner profile for protecting data on secondary profiles. But I am curious if Owner passphrase adds any additional protection to secondary profiles as of now, not taken into account any possible change in AOSP in the future.

                  Based on the previous official statement, I believe that as of now in order to unlock a secondary profile's data it is necessary to first unlock the owner profile or else to brute-force the secondary PIN/passphrase given an image of the storage, and/or to compromise some part of the hardware security.

                  But I think it's pretty clear that as of now a strong owner passphrase does not increase the strength of a weak secondary-profile PIN if one assumes a well-resourced attacker who has an exploit or is willing to disassemble a device. Thus I believe the answer to your core question is "no".

                  Some people might wish it were "yes" (e.g., you) and some people are glad it's "no" (people who hope someday there will be a way to boot straight to a non-admin profile, perhaps for a child). But it does appear that the present answer is "no".

                  DeletedUser115 For context, the reason I am asking it is remembering a second (or more) random passphrases for secondary user profiles is a lot of cognitive load for an older lady like me lol.

                  How necessary this might be depends on one's goals and threat model. If one has genuinely important data (perhaps multiple banking apps) in a secondary profile, a strong key may be necessary. How strong depends on one's presumed attackers and how much it's worth to keep the secondary profile secure. If the secondary profile has access to just one bank account containing one month's shopping money, auto-filled monthly by a different bank, maybe a medium-length PIN is enough.

                  DeletedUser115 Or, TL;dr: if one assumes the attacker is weak (restricted to treating the device as a black box and interacting with it via the regular login windows), as of now a strong owner passphrase probably provides substantial protection to the secondary profile BFU even if the secondary profile credential is weak. But if one assumes a well-resourced attacker, a strong owner passphrase provides close to zero added protection against a weak secondary-profile PIN.

                  Since you have mentioned scenarios involving the secure element being compromised, that sounds like the "well-resourced attacker" part of the space, thus I think the "close to zero added protection" answer applies.

                   For Pixel 6-8 in table 3, AFU support says FFS YES and BF NO. Does this mean that even if I don't provide the unlock code to the law enforcement agency, and the device is locked in the AFU state, they can obtain and analyse a complete file system image through Cellebrite and extract application data from it?

                    DeletedUser115

                    According to the dev team, GrapheneOS is a modification built on top of AOSP. If the original system can be breached, I don't think a modification built on top of it is safe either.

                      taiyi That simply isn't true. The entire purpose of GrapheneOS is to add security and privacy features and hardening on top of AOSP. It's the entire reason it exists.

                      This very thread contains proof that explains that Cellebrite has capabilities on stock, but doesn't have those same capabilities on GrapheneOS. Please take a look at the post again, particularly the attached images.

                        matchboxbananasynergy
                        I'm Chinxxx, and my threat model is against government law enforcement. Thanks to the GFW's "help" for Chinxxx people and the Chinxxx government's suppression of freedom of speech, Chinxxx people have a lot of experience with security and anonymity online.
                        In this case, if my goal is to keep law enforcement from breaking into a local device when I refuse to provide the unlock code, which do you think is the better choice, GrapheneOS or iPhone?