F.B.I. v. Apple: Privacy Concerns and National Security Interests Face-Off

February 19, 2016

On February 16th, Apple posted a Customer Letter on its website detailing its opposition to a federal court order that would require it to assist the F.B.I. in hacking an iPhone used by one of the shooters in the San Bernardino attack. The order, issued by U.S. Magistrate Judge Sheri Pym, asks Apple to write software that would allow the government to “break into” the phone without risking the loss of data; Apple has refused to attempt to create a “backdoor” into its own security systems.

The iPhone, used by Syed Rizwan Farook, is currently in the government’s possession, but the data within the phone is locked by a passcode known only to Farook. While Apple has already provided data that was backed up on iCloud servers, the F.B.I. claims access to the phone itself is essential because the iCloud back-up feature appears to have been intentionally turned off by Farook approximately six weeks prior to the attack. According to court filings, investigators want to unlock the phone with the assistance of a computer program that will “guess” the millions of combinations necessary to find the winning passcode. The problem is that investigators can’t use this program until Apple creates a way around the iPhone security feature that erases data from the phone after 10 failed login attempts. The government has further promised that the software it is requesting from Apple would only be used in this one case, on this specific iPhone.
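To see why that erase-after-10-failures feature defeats brute-force guessing, consider a toy model. This is an illustrative sketch only, not Apple's implementation; the class name, attempt limit behavior, and four-digit passcode space are simplifying assumptions made for the example.

```python
import itertools

MAX_ATTEMPTS = 10  # illustrative: mirrors the erase-after-10-failures setting


class PasscodeLock:
    """Toy model of a retry-limited lock; not Apple's actual design."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self.failed = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # data is gone; nothing left to unlock
        if guess == self._passcode:
            self.failed = 0
            return True
        self.failed += 1
        if self.failed >= MAX_ATTEMPTS:
            self.wiped = True  # the feature investigators want disabled
        return False


def brute_force(lock: PasscodeLock, digits: int = 4):
    """Try every numeric passcode in order; fails once the wipe triggers."""
    for combo in itertools.product("0123456789", repeat=digits):
        guess = "".join(combo)
        if lock.try_unlock(guess):
            return guess
        if lock.wiped:
            return None  # wipe fired before the right code was found
    return None
```

With the wipe enabled, exhaustive guessing only succeeds if the true passcode happens to fall within the first ten tries; disable the wipe and the same loop eventually finds any passcode, which is precisely the capability the court order seeks.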

Previously, investigators could tap into a device’s hardware port to access encrypted data, but Apple made changes in 2014 that now make it nearly impossible for forensic investigators to work around its passcode system. Apple has complied with court orders to retrieve data from iPhones running earlier versions of its operating system, but Farook’s phone ran iOS 9, which was built with default device encryption. When a passcode is created, it generates a key that is used in combination with a hardware key inside the phone. Together, the keys encrypt the device’s data. Apple CEO Tim Cook noted that software circumventing the passcode would not only weaken iPhone security in general but would threaten the safety and security of every iPhone: “Once created, the technique could be used over and over again on any number of devices . . . it would be equivalent to a master key.” Cook urges that it would simply be too dangerous to create such a tool.
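The passcode-plus-hardware-key scheme can be sketched in a few lines. This is a simplified illustration of the general idea, not Apple's code: the salt, iteration count, and the way the two keys are combined here are assumptions for the example, and the real device key is fused into the silicon rather than generated in software.

```python
import hashlib
import hmac
import os

# Hypothetical stand-in: on a real iPhone the hardware UID key is burned
# into the chip at manufacture and never leaves the device.
HARDWARE_UID_KEY = os.urandom(32)


def derive_encryption_key(passcode: str, iterations: int = 100_000) -> bytes:
    """Stretch the passcode, then entangle it with the device's hardware key.

    A sketch of passcode-entangled encryption under the assumptions above.
    """
    # Key stretching makes each guess deliberately slow.
    stretched = hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), b"illustrative-salt", iterations
    )
    # Mixing in the hardware key means the data key can only be computed
    # on this physical device, so guesses must run on the phone itself.
    return hmac.new(HARDWARE_UID_KEY, stretched, hashlib.sha256).digest()
```

Because the derived key depends on both inputs, knowing the passcode alone (say, from a copied disk image) is not enough; this is why investigators need the phone itself and why the on-device attempt limit matters so much.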

Ultimately, Cook has made valid points and taken a stance that many agree with. Though the government promises to use this software solely on Farook’s iPhone, it’s easy to see the “slippery slope” argument and acknowledge that the promise cannot be guaranteed (and is unlikely to be kept). Some argue the government is not asking for a backdoor but attempting to come in through the front door with Apple’s assistance; the software wouldn’t help someone break into any device anywhere, as actual possession of the device would be required. However, considering that the government may not even have the authority under the All Writs Act to issue orders like the one in this case, and the dangerous precedent that would be set by allowing a court to compel a software provider to create any software requested, the outcome of this debate will have far-reaching legal implications.