That would have eliminated their ability to fix bugs in the firmware, though. They literally did this last week with the "Error 53" patch.
There are tricks to allow this, but broadly no: what we're talking about here is Apple engineering a snoop-proof architecture that remains resistant when the attacker is Apple itself.
And that's just not going to happen in any practical way. Eventually, if the government wants to compel a backdoor in iOS encryption, there will be a backdoor in iOS encryption. Arguing otherwise is just fooling ourselves.
And it's a silly issue anyway. If you want snoop-proof encryption on your personal device, install Linux, select "encrypt my drive", and memorize a secure pass phrase. Done. Relying on a third party hardware vendor to do it for you won't ever work.
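To be concrete about why that's enough, here's a rough Python sketch of the idea (the function names are mine, and real LUKS/cryptsetup does far more, with different defaults): the only secret is the passphrase in your head, and the salt stored on disk is useless without it.

```python
import hashlib
import os

def derive_unlock_key(passphrase: str, salt: bytes) -> bytes:
    # Stretch the memorized passphrase into a 256-bit key with a slow KDF,
    # so every brute-force guess costs the attacker real compute.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt,
                               1_000_000, dklen=32)

salt = os.urandom(16)  # stored in the clear alongside the encrypted data
disk_key = derive_unlock_key("correct horse battery staple", salt)
print(disk_key.hex())  # stand-in for the key the data is encrypted under
```

Seize the laptop and you get the salt and the ciphertext; without the passphrase there's no practical way back to the key.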
Well, it would have to wipe the keys to update the firmware. Alternatively, the update process could ask for the password/passcode before applying it.
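Something along those lines might look like this hypothetical sketch (Python, purely illustrative: the class, function names, and flow are invented here, not Apple's actual Secure Enclave behaviour):

```python
import hashlib
from typing import Optional

class SecureKeyStore:
    """Stand-in for enclave-held key material; not the real implementation."""
    def __init__(self, passcode: str):
        self._unlock_token = hashlib.sha256(passcode.encode()).digest()

    def wipe(self) -> None:
        self._unlock_token = None  # user data keys destroyed, data unrecoverable

    def unlock(self, passcode: str) -> bool:
        return self._unlock_token == hashlib.sha256(passcode.encode()).digest()

def flash(image: bytes) -> None:
    print(f"flashing {len(image)} bytes of firmware")

def apply_firmware_update(store: SecureKeyStore, image: bytes,
                          passcode: Optional[str] = None) -> None:
    if passcode is not None and store.unlock(passcode):
        # The user proved knowledge of the passcode, so a vendor-signed
        # image on its own was never enough to get at the keys.
        flash(image)
    else:
        # No passcode supplied: the update still installs, but only after
        # the keys are wiped, so it can't be used to snoop on existing data.
        store.wipe()
        flash(image)

apply_firmware_update(SecureKeyStore("1234"), b"\x00" * 4096, passcode="1234")
```

Either branch lets the vendor ship firmware fixes; neither lets a firmware update alone open up data that's already on the device.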
I agree with you, but even if you install something open source you're still trusting the hardware, so at the moment there's basically no practical way to avoid trusting any hardware vendor at all. Obviously, when you get all your hardware and software from the same vendor, it becomes much more practical for that vendor to make a move on the government's behalf.
> even if you install something open source you're still trusting the hardware
Not for the encryption. That's done in software. A seized linux laptop with an encrypted partition using a strong key is effectively snoop proof by the definition we're using here.
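To put a number on "effectively snoop proof", assume a randomly chosen six-word Diceware passphrase and a very generous attacker doing a trillion raw guesses per second, ignoring any KDF stretching (which only makes it worse for them):

```python
GUESSES_PER_SECOND = 1e12      # generous assumption for the attacker
SECONDS_PER_YEAR = 3.15e7

combinations = 7776 ** 6       # six words off the 7776-word Diceware list, ~77.5 bits
avg_years = combinations / 2 / GUESSES_PER_SECOND / SECONDS_PER_YEAR
print(f"average time to brute-force: ~{avg_years:,.0f} years")  # a few thousand years
```

And that's the raw-guess case; the numbers only get worse for the attacker once the KDF slows each guess down.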
It's true that hardware could have other attack vectors: a key logger to intercept the pass phrase would be an obvious one. But again, that's just my point: Apple is in no privileged position here. If they get compelled to backdoor the iPhone then no amount of security architecture along the lines you posit is going to help us, because they can just be compelled to defeat it.
> Not for the encryption. That's done in software. A seized linux laptop with an encrypted partition using a strong key is effectively snoop proof by the definition we're using here.
You can't run the software without hardware, so the hardware has to be trusted too. Don't misunderstand me: this is obviously considerably more far-fetched than Apple attacking their own software/hardware combo. However, assuming (a big assumption) that we trust what Apple are telling us at the moment, a seized iPhone with a strong password would currently be just as snoop-proof. In fact, that includes the phone that spawned this conversation.
They can be compelled to defeat their own security only if you keep accepting updates to your phone. Under the security architecture I've (loosely) described, they can't attack it without the user accepting an update. Of course, you're totally correct in practice, because you're most likely just going to have to trust Apple updates as they come out.