In today’s world, the Internet is where we live much of our lives: we work, we shop, we chat with our loved ones. It also functions as our global town hall: a forum for debate and education, and a springboard for taking action. When the Internet is open and secure, it is possible to live our personal and civic lives online in a way that feels safe. Yet one of the elements most central to this feeling of safety is in danger of being undermined: encryption.
Let’s Take a Closer Look at Apple’s Clash with the FBI
The ongoing fight between companies and authorities over encryption reached new heights recently as it was revealed that Apple has been asked by the FBI to essentially build a backdoor into iPhone software.
In an open letter published by Apple CEO Tim Cook, he lays out exactly what the firm has been ordered to do and calls for a public discussion on the matter of encryption. For many, especially those outside the tech community, the subject can seem impenetrable, which makes such a discussion that much more difficult to foster. Apple has been given a deadline of Friday, February 26 to respond, and despite Cook’s defiant open letter opposing the order, the firm has yet to file any legal documents challenging the directive.
What Exactly has Apple Been Ordered to Do?
In December 2015, a married couple living in Redlands, California attacked a training event held by the San Bernardino County Department of Public Health. Fourteen people were killed and 22 were injured in the terrorist shooting.
Following the attack, an FBI investigation turned up an iPhone belonging to Syed Farook, one of the perpetrators. Apple was then ordered by a U.S. federal judge to provide investigators with access to the phone’s encrypted data, which, according to the court papers, the firm ‘declined to provide voluntarily’. After the court order was issued, Cook posted the open letter explaining the company’s opposition to the ruling. In it, he describes the government’s demands as follows:
“The U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation.”
One of the security features the FBI wants disabled deletes all user data after a certain number of failed passcode entries. If investigators keep guessing the passcode incorrectly, they risk losing all of the information on Farook’s phone, which is why they want that particular feature turned off.
If Apple is unable to disable the auto-erase function directly, the court order states that the firm should create software that enables investigators to do so.
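The mechanism at issue can be sketched as simple device logic. This is a hedged toy illustration only: the real iOS implementation runs in dedicated security hardware, and the ten-attempt limit assumed here is the threshold of the optional ‘Erase Data’ setting, not code from Apple.

```python
# Toy sketch of an auto-erase policy -- illustrative only; the real
# iOS logic is enforced by the device's security hardware, not by
# app-level Python. The 10-attempt limit mirrors iOS's optional
# "Erase Data" setting (an assumption for this example).

MAX_ATTEMPTS = 10

class Device:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self.wiped = False  # True once user data has been destroyed

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # nothing left to unlock
        if guess == self._passcode:
            self._failed = 0  # a correct entry resets the counter
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            self.wiped = True  # user data is erased
        return False

phone = Device(passcode="4821")

# A brute-force attempt: ten wrong guesses in a row...
for guess in (f"{n:04d}" for n in range(10)):
    phone.try_unlock(guess)

# ...and the data is gone, even if a later guess would have been right.
assert phone.wiped
assert not phone.try_unlock("4821")
```

This is exactly the trap investigators face: guessing at the passcode without first disabling this feature risks destroying the very evidence they are after.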
What is Encryption and Why is it a Big Deal?
Essentially, encryption is what keeps communications between parties safe from prying eyes. It shields sensitive data, like medical records and banking information. It also makes it possible to send confidential documents. And it ultimately enables a greater good: human rights workers, journalists and whistle-blowers can defend what’s right without placing themselves in danger. Most of us use encryption every day without even knowing it.
The debate over data encryption, and when, or even whether, authorities should have the right to order its removal, has been ongoing for some time. Tensions have increased since the Edward Snowden revelations exposed massive government surveillance in both the United States and the UK. Most recently, encryption has come under threat in the United States, as federal agencies like the FBI call on tech companies to facilitate access to encrypted communications.
Apple’s latest run-in with the authorities therefore carries more weight than it would have even five years ago, especially as companies, including Apple, have ramped up their security features in the wake of the Snowden leaks.
The End-To-End Encryption Method
In the current debate on data security, the key phrase to keep in mind is end-to-end encryption: a method of secure communication that stops third parties from accessing data while it travels from one party to another. For example, if you send a WhatsApp message to a friend, the data stays encrypted until it reaches your friend, so anybody who intercepts it along the way will be unable to read it.
End-to-end encryption relies on cryptographic keys that decrypt the data only once it has been received by the end user. These keys are stored on the end users’ devices, meaning the company’s servers, whether WhatsApp’s or Apple’s iMessage servers, simply relay the data without ever being able to decrypt it.
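The core idea, keys live only on the endpoint devices, so the relaying server sees nothing but scrambled bytes, can be illustrated with a deliberately simplified sketch. This is NOT a real cipher (real messengers use vetted protocols such as the Signal protocol that underpins WhatsApp); the keystream construction here is a toy stand-in, and the key and nonce values are made up for the example.

```python
import hashlib

# Toy end-to-end encryption sketch -- illustrative only, NOT secure.
# Real apps use vetted protocols; this just shows WHERE the key lives.

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Stretch the key and nonce into a pseudo-random byte stream."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    """XOR the message with the keystream."""
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))

decrypt = encrypt  # XOR-ing twice with the same keystream restores the message

# The key exists ONLY on the two endpoint devices (assumed pre-shared here).
shared_key = b"key-known-only-to-the-two-phones"
nonce = b"message-001"

ciphertext = encrypt(shared_key, nonce, b"meet at noon")

# The server relays `ciphertext` but holds no key, so it cannot read it.
assert ciphertext != b"meet at noon"

# Only the recipient's device, which holds the key, recovers the message.
assert decrypt(shared_key, nonce, ciphertext) == b"meet at noon"
```

The design point is that the server is a dumb pipe: handing it a copy of `shared_key` is precisely what would turn it into the ‘backdoor’ discussed below.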
Any effort to hand a copy of the key to the company itself would, in essence, create a ‘backdoor’ into the user’s data. This is what most concerns those who oppose increased surveillance powers for the authorities, since legislation could mandate the existence of such a ‘backdoor’ to assist investigations.
The Apple situation is similar, although not directly related to end-to-end encryption. In the open letter, Cook states the following:
“For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”
Essentially, Apple argues that creating a version of iOS with pared-down security features would amount to creating a key which, if a hacker got hold of it, would provide a backdoor into millions of users’ private information: much the same concern as in the end-to-end encryption key debate.
This is extremely important because not only would such software create a potentially devastating security threat for iOS users were it to fall into the wrong hands, it would also set what Apple calls ‘a dangerous precedent’ by establishing the government’s right to ‘reach into anyone’s device to capture their data’.
Why Should Apple Oppose the Order?
While law enforcement agencies have a legitimate need for evidence, all the more pressing in terrorism cases, the Constitution and the nation’s laws limit how investigators and prosecutors can collect it. In a 1977 case involving the New York Telephone Company, the Supreme Court held that the government could not compel a third party uninvolved in a crime to assist law enforcement if doing so would place “unreasonable burdens” on it.
The U.S. Federal judge’s order requiring Apple to create software to subvert the security features of an iPhone places just such a burden on the company.
To use Apple CEO Tim Cook’s words, the creation of a less secure iOS ‘would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes.’
Although the U.S. government insists that the software would be used only once, in the San Bernardino case, there are wider concerns about the precedent it sets and about how Apple and the authorities could keep the software secure enough to prevent others from accessing it and using the ‘master key’.
As Cook explains:
“The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices.”
According to Apple, the court order is an overreach on the part of the federal government: a solution worse than the problem itself. By building this particular version of iOS, Apple says, it would be creating a ‘backdoor into its own products’, opening the door to further government overreach on the one hand and large-scale hacking attacks on the other.
There is also the issue of whether other governments would be able to demand the same right as the U.S. government. If Apple agrees to let the U.S. government overrule its data safeguards, other governments, such as China’s, would likely demand similar access as well.
Even if the government succeeds in forcing Apple to help, this will hardly be the end of the story. Experts widely believe that technology companies will eventually build devices that cannot be unlocked even by the companies’ own engineers without the permission of users. Newer smartphones already have much stronger security features than the iPhone 5c Syed Farook used.