On iPhone 12-14 with iOS 17.5-17.5.1, does AFU mean that if the phone is in AFU mode they can unlock it?

    stopthefeds

    FFS refers to extracting data when they already have the PIN/password to unlock the device. Having an unlocked device doesn't mean you can extract all of the data from it. Their tool requires unlocking the device, enabling developer options (which requires authentication), enabling ADB, authorizing ADB access from their tool, and then uses unintended functionality in ADB to extract all of the data. FFS is their baseline capability, which generally works everywhere as long as the user doesn't have a device admin app disabling ADB. They could far more easily exploit the OS after unlocking it, but those aren't the kinds of exploits done by the Cellebrite Premium tool and would require something else.
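
    To illustrate the device admin point: a device owner app can block the developer options / USB debugging path entirely through a standard Android user restriction. A minimal Kotlin sketch, assuming the app is already provisioned as device owner; MyAdminReceiver is a hypothetical name:

    ```kotlin
    import android.app.admin.DeviceAdminReceiver
    import android.app.admin.DevicePolicyManager
    import android.content.ComponentName
    import android.content.Context
    import android.os.UserManager

    // Hypothetical admin receiver; it would also need to be declared in the app manifest.
    class MyAdminReceiver : DeviceAdminReceiver()

    // Applies the standard restriction blocking developer options and USB debugging,
    // which is the path the FFS extraction described above depends on. This only
    // succeeds when the app is the device owner (or a profile owner for its profile).
    fun disableAdbDebugging(context: Context) {
        val dpm = context.getSystemService(DevicePolicyManager::class.java)
        val admin = ComponentName(context, MyAdminReceiver::class.java)
        dpm.addUserRestriction(admin, UserManager.DISALLOW_DEBUGGING_FEATURES)
    }
    ```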

    BF refers to brute force attacks after they've exploited the device, which implies also exploiting the secure element on devices that use one to provide throttling.

    stopthefeds AFU means they can exploit the device in the AFU state and obtain nearly all of the data from it even without the BF capability. They have a capability not listed in this table called IPR, which was listed in previous versions of the table and enables them to obtain the PIN/password for iOS when they have the AFU capability. It's possible they now support this on every version with AFU support, so they stopped listing it. The details of IPR are unclear, and it seems it only works when the relevant memory hasn't been reused for something else. It's definitely not supposed to be possible, but iOS was, or still is, making a mistake that enables it. It may be fixed by now.

      stopthefeds Does closing a user profile put that profile back into BFU? Or does it stay unencrypted?

      Closing a secondary user profile unloads its encryption key, making all of its data inaccessible, so it is in some sense equivalent to BFU. However, traces from activity in that user profile may still exist elsewhere, such as in system-wide log files. Rebooting the device is the only way to be sure. Rebooting is also the only way to unload the encryption key for the owner profile or a private space.

      BFU (before first unlock) refers to a device that is powered off, or that has been powered on but has not yet had any valid credentials provided. Rebooting or powering off the device is the only way to return it to the BFU state.
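
      To make the BFU/AFU distinction concrete from an app's perspective: Android's file-based encryption splits app data into credential-encrypted and device-protected storage, and only the latter is usable before first unlock (Direct Boot). A rough Kotlin sketch using the standard APIs; the helper function name is just illustrative:

      ```kotlin
      import android.content.Context
      import android.os.UserManager
      import java.io.File

      // Returns a usable storage directory whether the device is BFU or AFU.
      // Before first unlock only device-protected (Direct Boot) storage is readable;
      // credential-encrypted storage stays locked until valid credentials are entered.
      fun usableStorageDir(context: Context): File {
          val userManager = context.getSystemService(UserManager::class.java)
          return if (userManager.isUserUnlocked) {
              context.filesDir  // AFU: credential-encrypted storage is available
          } else {
              context.createDeviceProtectedStorageContext().filesDir  // BFU fallback
          }
      }
      ```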

      stopthefeds It's back at rest in terms of the disk encryption. However, data can persist in memory in parts of the kernel, SystemUI, system_server and other system-wide software such as driver services. Our zero-on-free implementation in the kernel and userspace helps clear data as soon as possible, but it won't clear something that's not released. @ryrona mentioned the in-memory log buffer, which is one example. No sensitive data is supposed to be logged there, but not all apps respect that, and the interpretation of what counts as sensitive varies throughout the OS and apps. It's not the only example, just one of them. If you want to get back to a clean BFU state, you inherently need to reboot.
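
      As a practical illustration of "it won't clear something that's not released": code holding secrets should wipe and drop them as soon as it's done with them, since zero-on-free can only scrub memory once it's actually released back to the allocator. A rough Kotlin sketch; deriveKey is a hypothetical placeholder for whatever consumes the passphrase:

      ```kotlin
      // deriveKey is a hypothetical placeholder for whatever actually consumes the passphrase.
      fun unlockWithPassphrase(passphrase: CharArray, deriveKey: (CharArray) -> ByteArray) {
          val key = deriveKey(passphrase)
          try {
              // ... use the key while it's needed ...
          } finally {
              // Overwrite both buffers so the secrets don't linger in memory while the
              // device is in AFU; zero-on-free can only scrub them once they're freed.
              passphrase.fill('\u0000')
              key.fill(0)
          }
      }
      ```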

      GrapheneOS If you were to recommend a phone that can resist electronic forensics, would you recommend an iPhone or a GrapheneOS device? Does GrapheneOS have any advantages over iOS in resisting electronic forensics?

        I would recommend GrapheneOS - it is much more private (see: https://arstechnica.com/tech-policy/2025/01/apple-agrees-to-pay-95m-delete-private-conversations-siri-recorded/), its auto-reboot feature is much more customizable (and existed well before the iPhone had one), and it also has some other anti-forensic features.

        On the other hand - Apple has more aggressive and more deceptive marketing, which seems pretty important for some people... :)

        Zw10704 GrapheneOS is clearly doing better than iOS at this, as shown by the data above and other information. Why would we recommend an OS that is consistently being successfully exploited over GrapheneOS, which is consistently resisting it? Look at the information in the thread.

          Zw10704 Why can't Samsung's Knox resist Cellebrite forensics?

          I doubt Cellebrite wants people (including Samsung) to know. If Samsung knew, they would probably fix it. It's unlikely anybody here knows exactly why.

          Zw10704 Knox isn't a specific technical thing but rather Samsung's branding for a bunch of standard security features and a small number of low-impact Samsung-specific features. Pixels have much better overall hardware, firmware and software security than Samsung devices, along with supporting GrapheneOS for a massive upgrade to privacy and security.

          Despite us not making the hardware, GrapheneOS uses hardware-based security features not used by either Galaxy or Pixel devices, in addition to all the standard hardware-based security features used by the stock Pixel OS. Examples are hardware memory tagging in hardened_malloc and Vanadium for detecting memory corruption in production instead of only during development, pointer authentication codes in userspace rather than only the kernel, branch target identification throughout the kernel and userspace to cover what type-based CFI doesn't, hardware-based USB-C port control, etc. These examples show how GrapheneOS is not only a software-based security project but is also leveraging hardware security much more.

          We recommend reading through https://grapheneos.org/features, just bear in mind that some recent features like 2-factor fingerprint unlock aren't included there yet, and the amount of coverage on the page is not directly connected to importance/impact.

          4 days later

          In the early 2010s it was my understanding, through employee conversations, that both Cellebrite and XRY were assisted by handset manufacturers. This may have been merely port drivers, proprietary specs, or a plethora of other possibilities.

          During this same time AT&T stores used Cellebrite devices for transferring data between a customer's old and new devices (typically in a back office).

          Relevant today? Likely not, but still interesting.

          11 days later

          Hi, I am just curious to see where we are at with the security of iOS 18.2.1.

            Hello, I'm interested in this question: does it turn out that the Pixel 7 is no longer as safe as the Pixel 8 and newer against forensic tools?

            Why? I mean, what new information gives you that thought?

              Matthai
              I decided to ask because I saw your answer. «Pixel 8 and Pixel 9 both have Memory Tagging Extension, because they are running Arm v9 CPUs.

              Arm Memory Tagging Extension (MTE) was introduced in Arm v9, and is a hardware feature in CPUs designed to improve software security by detecting memory-related vulnerabilities.

              MTE helps catch two common classes of memory vulnerabilities: use-after-free vulnerabilities, when a program tries to use memory that has already been freed, and buffer overflow vulnerabilities, when a program writes more data to a memory block than was allocated.

              Memory safety has been a major source of security vulnerabilities for decades. Studies suggest that over 75 percent of vulnerabilities in Android are violations of memory safety.

              So I guess if you buy a Pixel 8 or 9, you will be pretty secure with GrapheneOS.» And I do not understand what this means for me as the owner of a Pixel 7. Does it mean the phone is no longer as safe against forensics?


                nameuser856 I do not understand what this means for me as the owner of a Pixel 7. Does it mean the phone is no longer as safe against forensics?

                The Pixel 7 is still as safe as it was against any particular kind of attack. But for some attacks the Pixel 8 and 9 are safer than the Pixel 7.

                So people making a purchase decision now might wish to pay more for a newer device, and people very concerned about security might wish to upgrade to get better coverage.