Apple, the cleverest company on the planet, claims it can’t unlock one dead terrorist’s iPhone without exposing all of our phones to malevolent hackers.
That’s very hard to believe — no, impossible to believe.
Yet it is the core of Apple’s argument in fighting a federal judge’s order to help the FBI crack the brain of the iPhone 5C used by Syed Rizwan Farook, one of the San Bernardino mass shooters.
Apple might ultimately prevail in court, but morally this is an awful case on which to crusade as corporate guardians of citizen privacy.
Farook, a radicalized American-born Muslim, and his wife, Tashfeen Malik, gunned down 36 people during a holiday party at a social services center on Dec. 2. Fourteen of the victims died.
Police killed Farook and Malik in a wild street shootout, after which thousands of rounds of ammo and bomb-making materials were found. Since then, the FBI has been searching for clues to possible co-conspirators, or to any communications with jihadist groups such as ISIS.
A critical piece of evidence is the encrypted iPhone used by Farook. Ironically, he didn’t even own the device; it was provided to him by his employer, San Bernardino County.
At the request of the Justice Department, a judge last week ordered Apple to bypass the software that limits attempts to guess the phone’s passcode. Normally the device would erase all personal data after 10 failed passcode tries.
The FBI says that Farook could have chosen from more than two billion possible passcode combinations to lock his iPhone, and that a computer might need five years to run through them all.
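The five-year figure is roughly plausible arithmetic. Apple’s own iOS security documentation describes a hardware key-derivation step calibrated to take about 80 milliseconds per passcode guess; that per-guess delay is an assumption drawn from that documentation, not from the FBI’s statement, but combining it with the two billion combinations the agency cites gives a back-of-the-envelope check:

```python
# Back-of-the-envelope check on the FBI's five-year brute-force estimate.
# Assumption: ~80 ms per guess, the hardware-enforced key-derivation delay
# Apple describes in its iOS security guide (not a figure from this column).

ATTEMPTS = 2_000_000_000        # "two billion-plus" combinations cited by the FBI
SECONDS_PER_ATTEMPT = 0.08      # ~80 ms per hardware-enforced guess (assumption)

total_seconds = ATTEMPTS * SECONDS_PER_ATTEMPT
years = total_seconds / (365.25 * 24 * 3600)
print(f"Worst-case brute force: about {years:.1f} years")  # ≈ 5.1 years
```

That worst-case figure lands right around the five years the agency describes, which is why it wants the retry limit and erase-after-10-tries protections switched off before any guessing begins.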
But before the search can begin, the agency says it needs Apple to electronically open a “back door” into Farook’s phone. Apple says no way. Think about this. The man who used the iPhone is extremely dead and also a mass murderer, which even in a free society tends to mute the privacy issue.
There also exists the possibility that other individuals were involved in Farook’s act of slaughter, or encouraged it from afar. That information would be highly useful to law enforcement — not just for prosecution purposes, but also for stopping future attacks.
If it led authorities to other terrorists, Farook’s private data could actually save innocent lives.
Apple says that’s not a good enough reason to unlock the man’s phone. To which many Apple customers, myself included, would reply: Are you people serious?
Much of the important information on Farook’s phone was also stored by Internet servers and wireless carriers, and has been accessible to the FBI. This includes records of phone calls, emails, SMS texts and interactions on social networks.
Data stored by Farook on the popular iCloud service was turned over by Apple to authorities. However, Farook last backed up the phone to iCloud on Oct. 19.
That means some data from the 44 days preceding the San Bernardino shooting remains only on the iPhone. These files would be found under Photos, Contacts, Videos, Notes and Messages.
Apple CEO Tim Cook insists that unlocking Farook’s iPhone would set a dangerous precedent, creating a pathway for crooks and hostile governments to penetrate personal devices.
It’s actually more a battle about principle than practicality, because Apple is very good at keeping programs secret. The Farook litigation is part of a running conflict with the Obama administration, which has pressured Apple, Google and other tech titans to be more cooperative in national security investigations.
We all value our private lives, and dislike the idea of anyone — including government agents — snooping inside our PCs, laptops or smartphones. But we also dearly want to live in a place that’s safe from violent monsters, homegrown or otherwise.
Striking that balance isn’t easy, but some circumstances are clear as a bell. Every day, judges issue search warrants in serious criminal cases. If Farook had kept a written notebook of ISIS associates, who would complain if the FBI confiscated it as evidence from his apartment?
Suppose such crucial information exists deep inside Farook’s iPhone. It’s inconceivable that Apple, with its unmatched genius, can’t come up with a secure way to crack that device — and only that device — and let investigators take their shot at those two billion combinations.
Surely this can be done without compromising the privacy of every iPhone in the world. This should be a no-brainer at a company so famous for its brains.