I'm talking specifically about iOS here. iOS, in contrast to Android, has NSFileProtectionComplete: "Shortly after the user locks a device (10 seconds, if the Require Password setting is Immediately), the decrypted class key is discarded, rendering all data in this class inaccessible until the user enters the passcode again or unlocks (logs in to) the device using Face ID or Touch ID." https://support.apple.com/guide/security/data-protection-classes-secb010e978a/web
Android currently provides this feature via the keystore API. It is possible for apps to keep data at rest while the device is locked, just like on iOS, and if they use StrongBox it's based on the secure element. Whether the keystore keeps keys marked as requiring user authentication and an unlocked device at rest, rather than just disallowing their usage, is a quality-of-implementation issue specific to each device, and we don't know much about it at this point.
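For reference, here's a minimal sketch of how an app can do that through the public keystore API. The alias, auth timeout and cipher choices are placeholders for illustration, not what any particular app uses:

```kotlin
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties
import javax.crypto.KeyGenerator
import javax.crypto.SecretKey

// Placeholder alias and parameters, purely for illustration.
fun generateLockedAtRestKey(): SecretKey {
    val spec = KeyGenParameterSpec.Builder(
        "example_data_at_rest_key",
        KeyProperties.PURPOSE_ENCRYPT or KeyProperties.PURPOSE_DECRYPT
    )
        .setBlockModes(KeyProperties.BLOCK_MODE_GCM)
        .setEncryptionPaddings(KeyProperties.ENCRYPTION_PADDING_NONE)
        // Key can't be used while the device is locked (API 28+).
        .setUnlockedDeviceRequired(true)
        // Require recent user authentication (here: within 15 seconds, API 30+).
        .setUserAuthenticationRequired(true)
        .setUserAuthenticationParameters(
            15,
            KeyProperties.AUTH_DEVICE_CREDENTIAL or KeyProperties.AUTH_BIOMETRIC_STRONG
        )
        // Back the key with the secure element (StrongBox) rather than the TEE keystore.
        .setIsStrongBoxBacked(true)
        .build()

    val generator = KeyGenerator.getInstance(KeyProperties.KEY_ALGORITHM_AES, "AndroidKeyStore")
    generator.init(spec)
    return generator.generateKey()
}
```

Note that setIsStrongBoxBacked(true) causes key generation to throw StrongBoxUnavailableException on devices without a secure element, so apps typically fall back to the regular TEE keystore there.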
Android is adding a similar data class for data that's at rest while locked, to avoid apps needing to add another layer of encryption via the keystore as they currently do. iOS making it easier does NOT mean that more iOS apps actually keep data at rest while locked. Signal doesn't do it on either Android or iOS, but Molly exists on Android, and in addition to its app passphrase it uses the StrongBox keystore with a key requiring authentication and an unlocked device. Apps could transparently use the keystore that way without an app passphrase too.
What you're referring to is an AFU (After First Unlock) extraction. The difference between an AFU extraction and a Full File System (FFS) extraction is that data protected by NSFileProtectionComplete, such as keychain data, is not available in an AFU extraction on iOS. On Android, AFU and FFS extractions are the same due to the lack of NSFileProtectionComplete-style class keys, and no secure element exploit is needed for an AFU or FFS extraction.
That's not correct. Android supports keeping data at rest while locked. Those are not the same thing. Neither OS has much data kept at rest while locked in practice. iOS makes this easier for app developers but easier does not mean more apps actually use it. Molly and several Android TOTP apps are counterexamples to that assumption.
Furthermore, even with Secure Enclave code execution, the plaintext passcode shouldn't be recoverable if Apple had implemented it correctly, because only the passcode verifier value, i.e. a hash of (passcode + salt), is saved by the Secure Storage Component. However, Cellebrite specifically claims they can get the iPhone plaintext passcode, suggesting Secure Enclave RAM is not clearing the passcode properly after user input. Please see Cellebrite's explanation of IPR (IPR.jpg).
The info you're quoting isn't correct since it ignores the hardware keystore on Android. Not everything they claim is correct.
Please forgive my ignorance about how key derivation works on Android. All iOS data protection key derivation happens in the Secure Enclave. You're saying that on Android, the first part of key derivation happens on the AP, and then in the TEE in the Titan M2?
Android does the first part of key derivation in the OS via scrypt. From that scrypt output, it derives a key for each purpose with a simple personalized hash approach, and those keys are passed along to the different use cases. One of those is the Weaver feature implemented by the secure element. Another is the initial authentication of the Owner user with the secure element, which is what authorizes signed updates to a newer version of the secure element firmware. Another is unlocking the keystore.
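Very roughly, the structure is something like the sketch below. This is not the actual AOSP code; the scrypt parameters and labels are made up, and BouncyCastle's SCrypt is used just for illustration:

```kotlin
import org.bouncycastle.crypto.generators.SCrypt
import java.security.MessageDigest

// Rough illustration of the structure described above, not the actual AOSP code:
// one scrypt pass over the credential, then a cheap personalized hash per consumer.
fun derivePerPurposeTokens(credential: ByteArray, salt: ByteArray): Map<String, ByteArray> {
    // First part of key derivation in the OS: scrypt over the lock method.
    val scryptOutput = SCrypt.generate(credential, salt, 16384, 8, 1, 32)

    // Personalized hash: split the scrypt output into purpose-specific values.
    fun personalize(label: String): ByteArray {
        val md = MessageDigest.getInstance("SHA-256")
        md.update(label.toByteArray())
        md.update(scryptOutput)
        return md.digest()
    }

    return mapOf(
        "weaver" to personalize("weaver"),       // checked/released by the secure element
        "keystore" to personalize("keystore"),   // used to unlock the hardware keystore
        "disk" to personalize("disk")            // fed into the later hardware-bound KDF
    )
}
```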
The Weaver token is how Android implements time-based throttling via the secure element. The secure element has no direct involvement in key derivation. It's not particularly high-performance hardware and is a poor place to do key derivation when the goal is protecting against attackers able to bypass all the hardware-based security features.
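A conceptual model of what a Weaver slot does, purely for illustration (this is not the secure element firmware, and the delay schedule here is a placeholder rather than the real one documented in the GrapheneOS FAQ):

```kotlin
// Conceptual model of a Weaver slot: it stores a (key, value) pair and only
// releases the value on a matching key, enforcing a delay after failed attempts.
class WeaverSlotSketch {
    private var storedKey = ByteArray(0)
    private var storedValue = ByteArray(0)
    private var failedAttempts = 0
    private var lockedUntilMs = 0L

    fun write(key: ByteArray, value: ByteArray) {
        storedKey = key.copyOf()
        storedValue = value.copyOf()
        failedAttempts = 0
    }

    // Returns the stored Weaver token on a correct key, or null while throttled
    // or when the key is wrong.
    fun read(key: ByteArray, nowMs: Long): ByteArray? {
        if (nowMs < lockedUntilMs) return null
        if (key.contentEquals(storedKey)) {
            failedAttempts = 0
            return storedValue.copyOf()
        }
        failedAttempts++
        lockedUntilMs = nowMs + delayMsFor(failedAttempts)
        return null
    }

    // Placeholder escalation, not the real schedule.
    private fun delayMsFor(attempts: Int): Long = when {
        attempts < 5 -> 0L
        attempts < 10 -> 30_000L
        else -> 30_000L * (1L shl ((attempts - 10) / 10))
    }
}
```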
The final key derivation is done in the Trusted Execution Environment (TEE) on the main SoC, which means in TrustZone. It uses an SoC cryptographic acceleration feature which provides hardware-bound key derivation among other features. The hardware-bound aspect means that exploiting the TEE shouldn't provide any way, either direct or indirect via side channels, to get access to the key used in the hardware-bound key derivation algorithm. On Snapdragon, the Qualcomm Crypto Engine provides these features and is meant to be used by the TEE applet for this.
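Conceptually, the hardware-bound step is a keyed derivation where the key never leaves the SoC. The sketch below only models the data flow with HMAC; it is not the actual algorithm used by the Qualcomm Crypto Engine or Tensor:

```kotlin
import javax.crypto.Mac
import javax.crypto.spec.SecretKeySpec

// Data-flow model only: in reality the hardware key is not readable key material
// and this step runs inside the TEE using the SoC's crypto engine.
fun hardwareBoundDerive(hardwareKey: ByteArray, intermediate: ByteArray): ByteArray {
    val mac = Mac.getInstance("HmacSHA256")
    mac.init(SecretKeySpec(hardwareKey, "HmacSHA256"))
    // The point of hardware binding: without running this step on the device itself,
    // an attacker can't offload the brute force even if they have the scrypt output.
    return mac.doFinal(intermediate)
}
```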
On Pixels and most modern Snapdragon devices, the OS does not receive the decrypted disk encryption keys but rather handles for using them. This is called the wrapped key feature. It prevents a kernel information leak, etc. from leaking the encryption keys, and they're presumably not meant to be available via a memory dump either. We haven't verified this, but the information from XRY, Cellebrite, etc. appears to indicate it works properly. Otherwise, XRY would not have had to use a leftover hash or something like that in memory to brute force the lock method. They did not appear capable of doing this with GrapheneOS. Further work has been done on GrapheneOS to rule out other ways this could happen, such as running a full compacting GC which zeroes the old heap for SystemUI / system_server when the screen is locked, rather than just the standard approach of doing it a bit after unlocking to wipe leftovers of the lock method. We also made some other changes and began auditing this.
You're right. Cellebrite claimed they can do FFS on the Pixel 8. They must have at least gained code execution on the AP in AFU mode and used the AP to communicate with the Titan M2 to decrypt data, if not gained code execution on the Titan M2 itself, correct?
FFS is when they have the user's lock method already, so all it has to involve is exploiting the device from the ADB shell after using the lock method to enable developer options, enabling ADB and authorizing their access. It's not any kind of fancy exploitation.
Thank you for informing me that Android is already implementing something similar.
The Android hardware keystore is what to look into if you want to know more. There are 2 hardware keystores on Pixels: the traditional TEE (TrustZone) keystore, which encrypts data and stores it in regular storage, and the StrongBox keystore provided through the secure element since the Pixel 3. The Pixel 2 had a secure element with insider attack protection (Owner authentication needed to update the firmware via valid signed updates to a newer version) and Weaver (covered in https://grapheneos.org/faq#encryption) but predates StrongBox being available.
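If you want to poke at this yourself, both keystores are exposed via public feature flags (FEATURE_STRONGBOX_KEYSTORE needs API 28+, FEATURE_HARDWARE_KEYSTORE API 31+):

```kotlin
import android.content.Context
import android.content.pm.PackageManager

// Reports which hardware keystores the device declares via its feature flags.
fun describeKeystores(context: Context): String {
    val pm = context.packageManager
    val tee = pm.hasSystemFeature(PackageManager.FEATURE_HARDWARE_KEYSTORE)
    val strongBox = pm.hasSystemFeature(PackageManager.FEATURE_STRONGBOX_KEYSTORE)
    return "TEE keystore: $tee, StrongBox (secure element) keystore: $strongBox"
}
```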
According to you, even without setting an additional database password in Molly, an AFU extraction will not yield the plaintext database? https://github.com/mollyim/mollyim-android/wiki/Data-Encryption-At-Rest doesn't say anything about this. Possible for @Nuttso to clarify?
It uses the StrongBox keystore and should be setting the key as requiring an unlocked device. You can check the code to see if it does that.
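If you'd rather inspect it at runtime than read the code, something like this sketch works against any AndroidKeyStore key (the alias is a placeholder, not Molly's actual key alias):

```kotlin
import android.security.keystore.KeyInfo
import java.security.KeyStore
import javax.crypto.SecretKey
import javax.crypto.SecretKeyFactory

// Inspect the protection flags recorded for an existing AndroidKeyStore key.
// "some_alias" is a placeholder; check the app's source for the real alias.
fun describeKeyProtection(alias: String = "some_alias"): String {
    val keyStore = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
    val key = keyStore.getKey(alias, null) as SecretKey
    val factory = SecretKeyFactory.getInstance(key.algorithm, "AndroidKeyStore")
    val info = factory.getKeySpec(key, KeyInfo::class.java) as KeyInfo
    return "userAuthRequired=${info.isUserAuthenticationRequired}, " +
        "enforcedBySecureHardware=${info.isUserAuthenticationRequirementEnforcedBySecureHardware}"
}
```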
My question is, if the Secure Enclave / Titan M2 is totally compromised with full code execution and a brute force attack is performed on the device, how long does it take to brute force a six-character alphanumeric passcode with lowercase letters and numbers?
On Android, this is based on the combination of the standard scrypt key derivation, which can be offloaded elsewhere without an exploit, and the final hardware-bound key derivation done within the TEE (TrustZone). There cannot be a specific answer for how long it takes, and an attacker could extract the key from the hardware to offload the hardware-bound key derivation part too. The amount of time spent on hardware-bound key derivation varies by device, and we don't know specifically how long it's configured to take in each Pixel generation.
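As a back-of-the-envelope illustration only: six characters drawn from lowercase letters plus digits gives a keyspace of 36^6 ≈ 2.18 billion, so the total time is just that multiplied by whatever the per-guess derivation cost ends up being, which is the part we don't have a concrete number for:

```kotlin
// Placeholder math only: the real per-guess cost depends on the scrypt parameters
// plus however long the hardware-bound derivation is configured to take.
fun bruteForceEstimate(perGuessMillis: Double): String {
    val keyspace = Math.pow(36.0, 6.0)      // 26 lowercase letters + 10 digits, length 6
    val averageGuesses = keyspace / 2.0     // expected guesses for a random passcode
    val averageDays = averageGuesses * perGuessMillis / 1000.0 / 86_400.0
    return "keyspace=%.0f, average %.1f days at %.0f ms per guess"
        .format(keyspace, averageDays, perGuessMillis)
}

fun main() {
    println(bruteForceEstimate(100.0))  // ~1260 days (~3.5 years) on average
    println(bruteForceEstimate(1.0))    // ~12.6 days on average
}
```

At 100 ms per guess that's roughly 3.5 years on average, while at 1 ms per guess it's under two weeks, which is why the per-guess cost of the hardware-bound part matters so much.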
How long does it take for a Pixel phone with the stock OS and GrapheneOS respectively?
It's currently the same and we can only increase the scrypt part. The hardware-bound key derivation part is in the TEE and we can't change that firmware beyond requesting improvements as we've done successfully in several areas.
Keep in mind that the delay mentioned in the FAQ (reproduced below) is all bypassed due to code execution on the Titan M2.
The FAQ explains that there's scrypt key derivation in the OS and then at the end there's hardware-bound key derivation in the TEE similar to the secure enclave key derivation on iOS. The details of the hardware-bound part vary by device and are drastically different on Snapdragon vs. Tensor along with evolving significantly over time on Snapdragon. We know more about how it works on Snapdragon than Tensor.