One of the oddest consequences of handing over so much of our lives to a set of complicated, interdependent technologies is that fights over our rights in that sphere are largely out of our hands. If the FBI wants a backdoor into our iPhones — devices we tend to treat like family members — there’s very little we can directly do to stop them. We need to rely on Apple to fight for us. In this sense, it’s heartening to read Apple CEO Tim Cook’s kiss-off letter to the feds, published on Apple’s website Tuesday night, refusing to grant a federal judge’s request to help the FBI break into the phone of Syed Farook, one of the San Bernardino shooters:
[T]he U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The security question at hand is relatively simple. Farook used a password-protected iPhone 5C, which the FBI would like to access. To the best of our knowledge, the only way to gain access to the phone (short of someone telling the FBI the right password) would be to “brute-force” the password — that is, guess as many combinations as possible until the phone opens up. Right now, the only way to enter passwords is by hand, which would entail putting some poor junior agent in a room for the next several decades pecking out passwords one at a time. And since an iPhone can be set to erase its data after ten failed password attempts, that would likely not work either.
But if the FBI could load its own firmware — elementary, foundational software — onto the phone, it could theoretically bypass the ten-failure threshold, and hook the phone directly up to a powerful computer to brute-force the password, saving the life and time of our hypothetical junior agent.
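To put the arithmetic in perspective, here’s a minimal Python sketch of what automated guessing looks like once the retry limit is gone. Everything in it is hypothetical — check_passcode stands in for the phone’s unlock check, and the passcode itself is invented — but it shows why the FBI wants the limit removed: a four-digit passcode allows only 10,000 combinations, trivial work for a computer.

```python
# Illustrative only: a toy model of why the retry limit matters.
# check_passcode stands in for the phone's unlock check; on a real
# device it is rate-limited and may wipe the phone after ten misses.

SECRET = "7364"  # hypothetical four-digit passcode

def check_passcode(guess: str) -> bool:
    return guess == SECRET

def brute_force():
    # A four-digit PIN has only 10**4 = 10,000 possibilities. With the
    # erase threshold and delays removed, exhausting them takes a
    # computer a fraction of a second.
    for n in range(10_000):
        guess = f"{n:04d}"
        if check_passcode(guess):
            return guess
    return None

print(brute_force())  # -> "7364"
```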
This is where Apple comes in. Apple’s phones won’t allow their firmware to be updated or replaced unless the new firmware carries an Apple “signature” (essentially, a cryptographic stamp, produced with keys only Apple holds, proving that the firmware is legitimate). The FBI doesn’t have the keys necessary to “sign” a firmware update as Apple. You can guess where this is going: The judge has ordered Apple to provide “reasonable technical assistance” to the FBI, possibly including “providing the FBI with a signed iPhone software file[.]”
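For the cryptographically curious, here is a rough sketch of the signed-firmware idea, using the third-party Python cryptography package (pip install cryptography). The key pair, the firmware blob, and the device_will_install function are all invented for illustration; Apple’s actual signing chain is considerably more elaborate. The essential point is that only the holder of the private key can produce a signature the device will accept.

```python
# Illustrative sketch of signed firmware, not Apple's real scheme.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Apple's side: a private signing key that never leaves the company.
signing_key = ed25519.Ed25519PrivateKey.generate()
# The phone's side: only the matching public key is baked into the device.
device_key = signing_key.public_key()

firmware = b"...new firmware image..."   # hypothetical blob
signature = signing_key.sign(firmware)   # only the private key can make this

def device_will_install(image: bytes, sig: bytes) -> bool:
    """The phone installs an update only if the signature verifies."""
    try:
        device_key.verify(sig, image)
        return True
    except InvalidSignature:
        return False

print(device_will_install(firmware, signature))               # True
print(device_will_install(b"FBI-modified image", signature))  # False
```

Without Apple’s cooperation, in other words, the FBI can modify the firmware all it likes; the phone simply won’t run it.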
The FBI insists that this would be a one-time deal: firmware for this phone, and this phone only. Apple disagrees: “The government suggests this tool could only be used once, on one phone,” Cook’s letter reads. “But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices.” In other words: No.
Why would Apple say no to the FBI? The first, obvious, and indisputably correct answer is that it’s the right thing to do: “We feel we must speak up in the face of what we see as an overreach by the U.S. government,” Cook writes. He’s right: This is a case of government overreach — and the latest in a series of rulings that distort the All Writs Act of 1789 in attempts to force corporations to help the government create backdoors, brute-force passwords, or otherwise bypass security measures.
But it’s also worth noting that this very public letter is coming in a post–Snowden leaks context, where — following embarrassing revelations of the extent to which the NSA was tapping into supposedly private data — tech companies are jockeying to demonstrate their commitment to privacy and security.
On these issues, Apple has a particular advantage over its biggest rival, Google, which has a bad reputation with privacy experts (Snowden himself has suggested that people not use it). But this is thanks less to a particular ethos or set of values than to a business model: Apple sells objects; Google sells user data. Apple can afford to have a relatively — relatively — good track record on data collection, privacy, and security (and it does). Google quite literally can’t. A letter and a stand like this one reassert that edge, and remind customers that Apple is on the right side of the issues they care about.
Which isn’t to suggest that Cook or his employees are taking on the FBI for anything less than the noblest reasons. Just to point out that — so long as we’ve managed to hand off our side in civil-liberties battles to for-profit corporations — we’re lucky that the right thing to do for privacy and the right thing to do for Apple’s business happen to be aligned these days.