One school of thought years ago was that the Large Hadron Collider, the enormous contraption built to bash particles together, might inadvertently become a doomsday device.
The collider has been in operation since 2008 and hasn’t yet destroyed the Earth by creating microscopic black holes or wreaking other planetary mayhem. But now, according to Apple CEO Tim Cook, an almost equally dangerous threat has emerged: that the iconic tech company might cooperate with the FBI in a terrorism investigation.
The FBI wants access to the iPhone of San Bernardino terrorist Syed Rizwan Farook, and Cook is resisting, couching his refusal in apocalyptic terms. Should Apple comply with a judge’s order to help the FBI, we’re supposed to believe, it will have created the privacy equivalent of a doomsday device, making everyone vulnerable to the intrusions of government and the depredations of hackers and criminals.
This is trite marketing — only Apple can save us from Big Brother, and by the way, please keep buying our phones — masquerading as bravery.
In the San Bernardino case, one wonders whose privacy Apple thinks it’s protecting. As former prosecutor Andrew McCarthy notes, Farook is dead. There is no doubt that the government has probable cause to search his phone. And the phone in question didn’t even belong to him. It is the property of his former employer, the San Bernardino County Department of Public Health, which had the right to search it at will and is fine with the FBI gaining access to it.
Apple wants to give the impression that the key to Farook’s phone is the key to the kingdom, but it’s not so. As Timothy Lee explains on the website Vox, the FBI doesn’t need to defeat the encryption on Farook’s phone and thus, in theory, endanger the encryption on other phones. It just needs to get into the phone. For that, it needs to get past the first line of defense, the device’s passcode.
This is possible through what is called “brute force.” A robot can punch in every possible combination until the phone unlocks. Except the iPhone has security features designed to defeat exactly that: escalating delays between failed attempts and, if the option is enabled, automatic erasure of the phone’s data after too many of them. The FBI wants Apple to modify the software on Farook’s phone to disable those protections so it can force the device open.
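For readers curious what “brute force” means in practice, here is a minimal illustrative sketch in Python; it is not anything Apple or the FBI actually runs, and the check_passcode function is a hypothetical stand-in for an unlock attempt. It simply shows that a short numeric passcode has so few possibilities that trying them all is trivial, and why attempt limits and delays are what actually stand in the way.

```python
# Illustrative sketch only: a 4-digit passcode has just 10,000 possibilities,
# so trying them all is easy in principle. On a real iPhone, escalating delays
# and optional auto-erase after repeated failures make this loop impractical,
# which is why the FBI wants Apple to modify the phone's software.
from itertools import product


def brute_force_passcode(check_passcode, length=4, digits="0123456789"):
    """Try every possible numeric passcode of the given length."""
    for attempt in product(digits, repeat=length):
        candidate = "".join(attempt)
        if check_passcode(candidate):  # at most 10**length tries
            return candidate
    return None


# Example with a made-up secret; a real device would lock out or wipe itself
# long before this loop finished.
secret = "7391"
print(brute_force_passcode(lambda guess: guess == secret))
```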
This isn’t Armageddon. As Lee writes, “Apple has tacitly admitted that it can modify the software on Farook’s iPhone to give the FBI access without damaging the security of anyone else’s iPhone.” No one is going to have his or her privacy compromised because the FBI, in this one instance, with heroic technological exertions, gets into a dead terrorist’s phone.
Apple’s position is basically, Don’t worry about ISIS, don’t worry about tracking down every possible lead in a terrorism case, don’t worry about the families of the victims, who, of course, want the mass murderer’s phone unlocked. Worry about what cooperating with the government might do to the company’s reputation, and about a parade of horribles that might ensue if we slide from here several miles down a slippery slope.
This case shouldn’t mean an inevitable downward slide. Susan Hennessey and Benjamin Wittes argue persuasively on the blog Lawfare that Apple’s warning about a “dangerous precedent” is inapt, since if cooperation is appropriate in this case, the only precedent is for cooperation in other appropriate cases.
If Apple wants to avoid getting coerced by Congress into building a real “backdoor” to all its software going forward, a much larger and more fraught proposition, it has embarked on a foolish course by choosing such an unsympathetic test case. The chances are high that Apple will be seen to be acting unreasonably — since it is acting unreasonably.
Apple’s contribution to American life is a product that is almost impossible to live without. The company is synonymous with sleek and cool, and has an enormous reservoir of goodwill. It shouldn’t diminish it by staking out an indefensible position and elevating it to faux high principle. Unlock Syed Rizwan Farook’s phone.
Rich Lowry can be reached via e-mail: comments.lowry@nationalreview.com.