First of all, I'm a moderator, not someone who makes these decisions, so keep that in mind. These are my own thoughts, and a developer with a better understanding of how things work might have a different and better take on some of them.
Second of all, I know the topic here is about police checking phones, but I'll avoid saying "police" and instead say "adversaries". I don't know anything about the situation there, so I can't comment, and I'm not going to make any assumptions about laws, whether anyone is breaking them, or whether certain police or other people's actions are legal or justified.
Anyway, I think there are a number of things wrong with the proposed approach. I'm just going to list some of the ones that come to mind:
- GrapheneOS is fairly well known, so it's possible adversaries already know about the OS and its features. Even if most of them don't, enough might, and the number of people who know about the feature would likely grow over time, making it less and less effective.
- Pretend the feature did exist... Even if the adversaries themselves didn't know about the OS or the proposed feature, they could be trained or instructed to follow an SOP written by people who do know what they're doing. So even clueless adversaries could be told to check for certain signs of a decoy phone or profile, or to check for other profiles and other tells, like ADB not being available or not working.
- Even if things are a certain way now, if GrapheneOS or even other OSes start shipping a feature to show decoy profiles, adversaries may be trained to look for signs of them.
- In the situation described above, adversaries don't even need to know about GrapheneOS to suspect decoys. For example, they may already be prepared for people carrying entire backup decoy phones, so even without knowing about a specific OS's features, they can still be on the lookout for decoys and identify them by other means: checking whether there are recent photos in the photo album, whether messaging apps have recent message history, and so on.
- A decoy profile would have to be convincing. This means keeping it up to date with fake data.
- From a technical perspective, when a device is BFU (Before First Unlock, i.e. it hasn't been unlocked since boot), only the owner profile can be logged in (that's how it works now, though it may change in the future). So the feature wouldn't even work if the device has been rebooted: how would a decoy profile be loaded in that case? See the sketch just after this list.
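For background on what BFU means in practice: each profile's credential-encrypted storage stays locked after a reboot until that profile's own credential is entered, and that locked/unlocked state is something apps can observe through Android's standard UserManager API. A minimal sketch (my own illustration, not GrapheneOS code; the function name is made up):

```kotlin
import android.content.Context
import android.os.UserManager

// Illustrative only: whether the calling profile's credential-encrypted
// storage is currently available. After a reboot (BFU), it stays locked
// until that profile's own credential has been entered.
fun isProfileStorageUnlocked(context: Context): Boolean {
    val userManager = context.getSystemService(UserManager::class.java)
    // false while this user's credential-encrypted storage is still locked,
    // e.g. right after boot, before the user has unlocked.
    return userManager.isUserUnlocked
}
```

The point being: a "decoy" profile presented while the device is still BFU would have none of its data available, so there'd be nothing convincing to show.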
I feel the proposed decoy profile feature is only good enough to work in situations where adversaries don't know what they're doing. It's not a very good feature if very basic training is all it takes to defeat it. That would mean we'd be right back where we started and developers would have wasted a lot of time on something that ended up being ineffective. Not to mention that adding ineffective features can harm the project's reputation, which can affect donations, etc. This is why these kinds of features have to be well thought out and planned carefully. A feature like this would also have to be maintained into the future, so it's not just about the cost (and I don't mean just monetary cost) of implementing it in the first place; there's also the cost of maintaining it. An "expensive" feature needs to be a really great one for it to be justified.
Even without the proposed feature, I don't see why people who want a decoy profile can't just use existing features. The owner profile can just as easily be your "decoy": use it as you normally would (which solves the convincing-data problem), then use a secondary profile for more sensitive things and end its session when it's not needed. You can even set up multiple profiles, 10 or so, give them all names and use the same PIN/password for some of them. If adversaries find them and demand the PINs/passwords, you can unlock some of them but claim to have forgotten the others. They'd still have been given multiple PINs/passwords, including the one for the owner profile and possibly other profiles, so they may be satisfied with that. In the situation described above, this setup may be just as effective as the suggested decoy profile feature.
Manole they are simply trained to believe that it's better to return to the base with a working device and the password.
So using the owner profile as your "decoy" may very well be good enough.
Manole thanks to VeraCrypt’s brilliant feature that allows this
I wrote out all of the above before seeing this, and then remembered that the project responded to The Shufflecake Project about decoy profiles a few days ago on Mastodon: https://grapheneos.social/@GrapheneOS/114919771819553165
Manole There should be an option implemented in GrapheneOS that allows for a secondary PIN, similar to the existing DURESS PIN feature. However, when this secondary PIN is entered, instead of wiping the phone, the system should display a fake system update screen.
Manole this process should take at least 10 hours to complete, then when 100% complete can be wiping.
If I understand the duress feature correctly, this wouldn't work. The fake update could be interrupted (just like a real update can be safely interrupted), which would interrupt the wiping process. Adversaries could know to forcefully power off updating devices to be safe, or could be trained to do so.
I also don't think a timed duress feature makes much sense. Using real or "wall clock" time isn't a good idea (the clock can simply be changed), so it would have to use a monotonic timer counted from when the device starts up. That makes it a bad feature, because the countdown can be delayed simply by restarting the device. The existing duress feature doesn't have this problem.
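To make the clock problem concrete, here's a minimal sketch (my own illustration, not GrapheneOS code; the class name and delay constant are made up) of a delayed wipe driven by Android's monotonic clock, and why a reboot defeats it:

```kotlin
import android.os.SystemClock

// Hypothetical delay before the "fake update" would actually wipe.
const val WIPE_DELAY_MS = 10L * 60 * 60 * 1000 // 10 hours

class DelayedDuressTimer {
    // Wall-clock time (System.currentTimeMillis()) can be changed by whoever
    // holds the device, so it can't be trusted for a deadline. The monotonic
    // clock (SystemClock.elapsedRealtime()) can't be set, but it counts from
    // boot, so it resets to roughly zero every time the device restarts.
    private val startedAt: Long = SystemClock.elapsedRealtime()

    fun shouldWipeNow(): Boolean =
        SystemClock.elapsedRealtime() - startedAt >= WIPE_DELAY_MS
}
```

If the timer isn't persisted, a forced power-off restarts the countdown from zero; if it is persisted, simply keeping the device powered off stops it from ever firing. Either way, rebooting or shutting down delays the wipe indefinitely, which the existing immediate duress wipe avoids.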
Another problem is that for this to work, the fake update would have to behave quite differently from a real update in its UI/UX, which means anyone who knows about the feature would recognize that it was triggered and would know that forcing a shutdown stops the device from irreversibly erasing itself.
Maybe there could be ways around that, like setting some flag, but then you may end up in the position where they know you triggered a delayed duress feature, yet the device isn't actually wiped. You even said in previous posts that if the device is updating, they'd send it somewhere else to actual experts. The experts would know, and that might lead to more and potentially much bigger problems.
Again, these are my thoughts. Someone else from the project might have a better and more informed take on the proposed feature.
Manole It wasn’t until they got back to their forensic lab that they realized it was a decoy profile. But at that point, they could no longer beat or intimidate me — because at their base, there are lawyers present, surveillance cameras, and procedures that prevent them from doing such things.
Don't want to say all the above without addressing this...
I don't want people to read responses like this and think the project doesn't care about these kinds of situations. The project is very aware that badly designed or badly implemented features can put people at increased risk. This is why the project doesn't add features unless they're done right. See the response I linked above: the official project account says profiles aren't a good fit for this kind of feature. And, as I pointed out, a fake update screen plus a delayed duress wipe could be a very problematic feature.