A judge in the US has ordered Apple to provide ‘technical assistance’ to the FBI by creating what some (but not all) cybersecurity experts call a backdoor. In the few years I’ve written about these issues, I’ve never seen anything as hotly debated as this, with folks from digital security to foreign policy coming down on both sides of the debate.
On one hand, it seems a bit snarky of the FBI to use this one particular case, the one that looks to have the highest possible chance of success, to set a precedent; on the other hand, it seems mighty nasty of Apple to refuse to comply with a court order to crack into a terrorist’s phone.
So here are some facts of the case.
The phone in question belonged to Syed Rizwan Farook, one of the shooters in the San Bernardino attack, which caused the deaths of 14 people. America has numerous mass shootings, but this one involved two Muslims aligned with ISIS, and hence was more easily labeled terrorism, without the need for adjectives like ‘domestic’.
As I blogged about last week, self-radicalized terrorists don’t get funding from headquarters, and without that glorious ISIS oil money, all these guys could afford was an iPhone 5C, an entry-level phone with hardware nearly identical to that of the iPhone 5, a phone launched waaaayy back in 2012 (around the time Manchester United last won the Premier League). As an older phone, the security architecture of the 5C lagged behind the current generation of iPhones, all of which have a Secure Enclave, but make no mistake, it’s still pretty secure.
By pretty secure, I mean that the phone has all of its contents encrypted and unreadable to anyone without the encryption key. The key is derived from both the user’s passcode and a randomly generated hardware key that is unique to that specific iPhone. It is generally understood that Apple doesn’t keep track of the hardware key and is therefore unable to provide it, and as you might expect, the hardware will never give up its key under any circumstance. Without the hardware key, the encrypted data is unreadable, even with the passcode. Which explains why the FBI can’t suck the data out of the device for decryption on a more powerful computer, or load the data onto thousands of iPhones for parallel cracking.
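To make that concrete, here’s a minimal sketch in Python of how entangling a passcode with a device-unique key works in principle. This is emphatically not Apple’s actual algorithm; the real derivation runs inside the phone’s crypto hardware, and the primitive, iteration count and key values below are illustrative assumptions.

```python
import hashlib

def derive_encryption_key(passcode: str, hardware_uid: bytes) -> bytes:
    """Entangle the user's passcode with the device-unique hardware key.

    Illustrative only: on a real iPhone this runs inside dedicated
    crypto hardware, which is what forces every guess to happen on
    this one physical device.
    """
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),   # what the user knows
        hardware_uid,        # what only this phone has (used as the salt here)
        iterations=100_000,  # a made-up cost, tuned so each guess is slow
    )

# A stand-in hardware key; the real one is burned into the chip at the
# factory and (as far as anyone knows) never recorded by Apple.
fake_uid = b"\x13" * 32

# Without the right hardware_uid, even the correct passcode derives
# garbage, which is why the data can't be cracked off-device.
key = derive_encryption_key("123456", fake_uid)
```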
But even with the phone, things remain tough. To decrypt the phone, both the hardware AND the passcode are required. And if you don’t know the passcode, things don’t look good. iPhones are hardened against passcode guessing, just like ATM cards. If a thief managed to obtain your ATM card, they could take it to any ATM and try to guess your PIN, but they only get 3 attempts before the ATM sucks in the card and ends the attack. The odds of guessing a 6-digit PIN in just 3 attempts are worse than striking the lottery, which is why losing your ATM card isn’t that big a deal, and from a strictly data perspective, losing your iPhone isn’t a big deal either, provided you secured it with a long enough passcode. (Of course, no one likes losing an RM3,000 phone.)
Similarly, iOS works to limit an attacker’s ability to guess the passcode, both by slowing down the rate of guesses (through artificially delaying retry attempts, sometimes by hours) and by erasing the entire contents of the phone after 10 incorrect passcode entries, the latter being a feature the user has to actively switch on.
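Here’s a toy model of those two protections, with made-up delay values (Apple doesn’t publish the exact schedule, so treat the numbers as placeholders):

```python
import time

# Invented escalating delays (in seconds) after repeated failures;
# the real iOS schedule differs, but the shape is the same.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}
WIPE_AFTER = 10  # auto-erase threshold, only if the user switched it on

class PasscodeGuard:
    def __init__(self, correct_passcode: str, erase_enabled: bool):
        self.correct = correct_passcode
        self.erase_enabled = erase_enabled
        self.failures = 0
        self.wiped = False

    def try_passcode(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("contents erased; nothing left to unlock")
        if guess == self.correct:
            self.failures = 0
            return True
        self.failures += 1
        if self.erase_enabled and self.failures >= WIPE_AFTER:
            self.wiped = True  # the auto-erase feature
        time.sleep(DELAYS.get(self.failures, 0))  # the artificial delay
        return False
```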
All these protections against brute-force passcode guessing are baked directly into iOS, Apple’s iPhone operating system, and even though this was an older-generation phone, it still ran the latest software (yet another reason to admire Apple). Which also means that previous hacks that bypassed these protections have since been patched. Like I said…pretty secure.
To bypass these obstacles, the FBI is asking Apple to provide them with a special, tailor-made version of iOS that would eliminate these protections (a sketch of what that adds up to follows the list), specifically:
- The removal of the Auto-Erasing feature, which would otherwise erase the contents of the phone after 10 incorrect passcode entries.
- The removal of the artificial delays in attempting the passcodes, allowing the FBI to try passcodes at the fastest rate the hardware will allow.
- The ability to submit passcodes to the iPhone electronically, eliminating the need for a human to enter them by hand.
- And, the special software could be further customized to work on only this one iPhone, down to its serial number.
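Put together, those four items amount to something like the hypothetical loop below. Everything in it (the device API, the serial-number check) is invented for illustration; it simply shows why removing the protections reduces the problem to trying every passcode at hardware speed.

```python
import itertools

TARGET_SERIAL = "THE-ONE-IPHONE-5C"  # item 4: locked to a single device

def brute_force(device):
    """Hypothetical sketch of what the requested iOS build would enable."""
    # Item 4: the custom build refuses to run on any other phone.
    if device.serial_number != TARGET_SERIAL:
        raise RuntimeError("firmware locked to one specific device")

    # Items 1-3: no auto-erase, no artificial delays, and passcodes
    # submitted electronically, so guesses run at the raw hardware
    # rate (~80ms each) instead of a human's typing speed.
    for guess in itertools.product("0123456789", repeat=4):
        passcode = "".join(guess)
        if device.try_passcode(passcode):  # hypothetical API
            return passcode
    return None
```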
In essence, the FBI is asking Apple to create a ‘special’ ATM that would allow them to try PIN codes for a specific ATM card without sucking in the card after 3 failed attempts. That ‘special’ ATM would also have the ability to accept PINs electronically rather than having someone enter them manually.
If a terrorist told you that the coordinates of a bomb were the PIN to his ATM card, and neither the bank nor the police had any way to ascertain the PIN other than creating that ‘special’ ATM, do you think the bank should do it? Would it be considered a ‘backdoor’?
All in all, it seems like a pretty reasonable request, but what makes this so controversial (at least in cybersecurity circles) is whether the 4 items above amount to a ‘backdoor’. Some experts say they do, others say they don’t.
What isn’t controversial is Apple’s ability to do this. Experts all agree that it’s possible, and even Tim Cook’s brilliant PR response didn’t deny the company’s ability to do so. So the question is whether Apple ‘should’ do this.
‘Should’ is a strange word. This is, after all, a phone that belonged to a terrorist, and shouldn’t Apple do everything in its power to help law enforcement? What if the phone has contact details of other ISIS operatives in the US, or what if it had more details of ISIS operations in Syria? Wouldn’t we want that, regardless of how ‘burdensome’ it might be to Apple? This has nothing to do with the 4th Amendment; after all, the owner of the iPhone is dead, and a court warrant is available, so what’s stopping Apple? [Update: Actually, the ‘owner’ of the iPhone was San Bernardino County, which means it belonged to a government rather than an individual, definitively ending any 4th Amendment protections.]
To be fair, there’s only a very small chance that these ‘self-radicalized’ lone wolves had anything of grave importance on their old iPhone, but when words like terrorism are bandied about, a 1% chance is more than enough.
Which is why I feel it’s a bit snarky for the FBI to use this case; it seems the perfect case. It involves terrorism (which pushes everyone’s emotional buttons), an old iPhone that can be cracked, and no constitutional protections to violate. But you can bet that if the FBI succeeds on a legal level, the precedent set by this case will be used for other cases that don’t involve terrorism and that affect more recent versions of the iPhone.
And that’s what scares Apple.
That if it sets a precedent by complying with this court order, then it will have to comply with all other court orders, potentially thousands more, requesting the very same thing, and the perception of iPhone security will take a severe beating; after all, there’s a reason drug lords and terrorists use iPhones. Let’s also accept that if one judge rules for Apple to comply for this iPhone, explaining to future court judges the technical intricacies of the Secure Enclave, RAM-disk implications, and digital signatures of newer iPhones would be a task so monumental that, from a legal standpoint, this iPhone 5C is all iPhones.
The question of ‘should’ must also take into account the precedent this might set, and look beyond the specifics of this one case.
Plus it really would be burdensome on Apple to replicate what it might do in this one case thousands of times. Every time it did so, it would have to take out its super-secret iOS signing key and digitally sign the new operating system. Apple’s entire iOS security is premised on keeping that key secret, and since iOS runs on all of its iPhones, and iPhones are a $50-billion-a-year business, the signing key is quite possibly the most valuable secret in the world. If you have it, you’ve broken all of iOS security, plain and simple.
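To see why the key is everything, here’s a minimal sketch of code signing using the `cryptography` package’s Ed25519 API as a stand-in; Apple’s actual iOS signing scheme is a different, far more involved beast, but the principle is the same: the phone only boots what the private key has signed.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Stand-in for Apple's super-secret signing key. The matching public
# key ships baked into every iPhone's boot chain.
signing_key = Ed25519PrivateKey.generate()
public_key = signing_key.public_key()

firmware = b"an iOS build, protections removed or otherwise"
signature = signing_key.sign(firmware)

# The phone refuses any firmware whose signature doesn't verify,
# which is why the FBI can't just write its own unlocked iOS:
# only the holder of the private key can produce a valid signature.
try:
    public_key.verify(signature, firmware)
    print("phone accepts this build")
except InvalidSignature:
    print("phone refuses to boot")
```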
If you were a shareholder, would you like Apple to take out this secret key every time the FBI, DEA or NYPD came knocking? And what’s to stop this precedent from being enforced internationally? Why would China, Apple’s fastest-growing market, feel that its law enforcement isn’t entitled to the same privilege as the FBI?
How would you feel if Apple complied with the FBI’s request for this terrorist case, and then complied with similar requests from Israel, China, Australia…or Malaysia?
And finally, to circle back to the bank analogy: the ATM has no personal data, and the data on the iPhone doesn’t represent a clear and imminent threat; it holds details of an event that has already happened.
If you were CEO of Apple what would your stand be?
Post Script
The iPhone has a hardware limit of 80ms per passcode attempt, which means it would take about 15 minutes to crack a 4-digit passcode, and about a day to crack a 6-digit one. However, a sufficiently lengthy alphanumeric passcode (composed of both digits and characters) would still take years to crack even if the FBI obtained its special iOS, so you know what to do if you want to protect your future drug empire.
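The back-of-the-envelope arithmetic behind those numbers:

```python
ATTEMPT = 0.080  # seconds per guess: the iPhone's hardware floor

for digits in (4, 6):
    worst_case = (10 ** digits) * ATTEMPT
    print(f"{digits}-digit passcode: {worst_case / 3600:.1f} hours worst case")
# 4 digits -> ~0.2 hours (about 13 minutes)
# 6 digits -> ~22 hours (about a day)

# A 10-character passcode drawn from 36 symbols (digits + lowercase):
years = (36 ** 10) * ATTEMPT / (3600 * 24 * 365)
print(f"10-char alphanumeric: ~{years / 1e6:.0f} million years worst case")
```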
Apple has never complied with similar requests before; even though you may have read that it acquiesced 70 times prior, that’s just technically wrong. I love Shane Harris and his Rational Security podcast, but he doesn’t get the nuance of this issue. First of all, the number 70 is a government estimate, not something Apple agreed with, and while we don’t have the specifics of each case, some argue these involved much older devices running iOS 7 or earlier, which didn’t encrypt everything under the passcode, meaning data could be extracted without needing the passcode.
But think about what that means: if Apple could extract the data from a phone without the passcode, what’s to stop the likes of Russian cyber-criminals or Chinese state-sponsored hackers from doing it as well? Hence, Apple improved the security of its software to ensure that no one without the passcode could access the contents, and what the FBI is requesting now is a tailor-made iOS that intentionally circumvents those protections, undoing all of what Apple has engineered. This is the first time the FBI has requested this, not the 70th.
#Update 1: As it turns out, the iPhone actually belonged to the owner’s employer, the San Bernardino County Health Department, and part of the reason we’re in this mess is that they reset the password for the associated Apple ID account shortly after the attack. Had they not done that, Apple may have been able to provide the data from the iCloud backup without all this kerfuffle.
#Update 2: Cybersecurity ‘legend’ John McAfee has come out and said that his team of elite hackers will be able to crack the encryption of the iPhone in 3 weeks. Now we know this is legit, because if you Google ‘Cybersecurity Legend’ his name does come up, a lot! But McAfee is legendary for the wrong reasons: apart from having his name associated with “a barely passable virus-scanning program that updates at the worst possible times”, he’s also famous for producing YouTube videos of himself snorting drugs with prostitutes, running for president and, believe it or not, writing books about yoga. But jokes aside, the offer he makes to the Feds is that he will crack the iPhone 5C in under 3 weeks, and he’ll use social engineering. Now unless he’s able to psychologically trick a dead person into revealing their passcode, or charm an iPhone into revealing its encryption key, this is a joke of an offer, more worthy of being reported in The Onion than on the countless media outlets who probably don’t get it.
For the best articles on the topic:
The EFF Deeplinks blog, which answers all your high-level technical questions.
Jonathan Zdziarski’s amazing post, which covers more deeply why this isn’t only for one iPhone, and why legally Apple would have to make its ‘broken’ iOS available to the defence counsel.
A second Zdziarski post on Bill Gates’ ribbon and ribbon-cutter analogy.