They've made it impossible on the newer hardware, I think. (The Secure Enclave, in the phones that have it, deletes the decryption keys stored within it after receiving ten incorrect auth attempt requests from the CPU. The Secure Enclave cannot—as far as we know—be updated to make it brute-force-able.)
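As an illustrative sketch only (not Apple's actual implementation, which lives in dedicated hardware and firmware), the erase-after-ten-failures policy described above amounts to a failure counter guarding key material, with the key wiped once the limit is hit:

```python
import os

class SecureEnclaveModel:
    """Toy model of the described policy: hypothetical names, not Apple's API."""
    MAX_ATTEMPTS = 10

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._key = os.urandom(32)   # stand-in for the wrapped decryption key
        self._failures = 0

    def try_unlock(self, guess: str):
        if self._key is None:
            return None              # key already erased; data is unrecoverable
        if guess == self._passcode:
            self._failures = 0
            return self._key
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._key = None         # wipe the key: brute force is now pointless
        return None
```

The point of the design is that once the key is gone, even the correct passcode recovers nothing, so removing the retry limit in software buys an attacker nothing against hardware that enforces the wipe itself.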
It's just the old versions, before Apple started making privacy a political tentpole, that they could ever manage to backdoor in this particular way. That's still bad—but what Apple are mainly worrying about is that this precedent would force them to aid the FBI in other act-of-creation ways, not just in ways involving introducing backdoors.
The FBI might compel Apple to, for example, build a monitoring and clustering service into iTunes Genius/Apple Music to de-anonymize people through their music preferences; or they might request Apple build a facility to censor all messages travelling through APNS that mention the names of informants in active sting operations. Or they might compel Apple to build a secret CPU ring-0 elevation handshake into the A10 chip.
Even if the new generation of iOS devices was completely un-backdoor-able, and the usage of the old generation was effectively nil, that still wouldn't make the FBI's request harmless. The FBI are likely much more concerned about all the orders they could give to, effectively, turn Apple's engineering talent into a domestic-surveillance consultancy.
Apple -- or any other tech company -- could probably prevail in court against a lot of the more outlandish FBI scenarios that you describe. Most appellate courts take privacy quite seriously. So does the Supreme Court, with both right- and left-leaning justices getting there just fine on their own. The Fourth Amendment is pretty clear on this.
The problem for Apple, or any other tech company, is that once you say "okay" once, you can't swat away the next one by saying "Impossible" and being done with it. You have to lawyer up. That takes time and costs money.
So Apple's real issue is that they don't want to spend many millions of dollars (billions?) on lawyers to get the FBI and other alphabet agencies to behave properly. That's a very legitimate business position. It's just a little more nuanced than what we were told on day one.