
Securing the Glass House: Apple, Encryption, and the Fight Over Backdoors (Part I)

This summer, as the world eagerly awaited the latest i-offerings from Cupertino, home of computer behemoth Apple Inc., James Comey, the current director of the Federal Bureau of Investigation, published a list of concerns online regarding recent evolutions in encryption.1 According to Director Comey, the “logic of encryption will bring us, in the not-too-distant future, to a place where devices and data in motion are protected by universal strong encryption.”2 While he acknowledged the potential of such “digital lockboxes” to protect the secrets of millions of Americans, he warned that the same devices and technology could be misused to shield criminal communication in a post-terror world.3

Director Comey’s concerns were illuminated almost two months to the day later, when the New York Times reported that Apple Inc. had declined to comply with a court order obtained by the Justice Department during the summer of 2015. The order sought production, in real time, of text messages exchanged between suspects in a firearm-and-narcotics investigation who were using iPhones; Apple responded that its proprietary iMessage service was encrypted and that the company could not comply.4 Apple’s reticence to allow government access to a “backdoor” – the industry term of art for a “method bypassing authentication or other security controls in order to access a computer system or the data contained on that system” – reflects another reality in the evolution of encryption.5 Simply put, the company does not have a way to do so.6

Apple’s iMessage and FaceTime services use end-to-end encryption through the use of public keys, which allow users to send and receive messages with one another; during this communication, invisible bits of code are exchanged between the participants, enabling those involved to be sure both that the messages are authentic and that they come from the claimed sender.7 Since the encryption process is tied to the physical unit, and Apple does not back up copies of these messages, the company cannot deliver information on the content of messages in response to court orders.8 Additionally, full device encryption – meaning a locked device cannot be accessed by someone who does not know the rightful owner’s password – on the iPhone means that, even if a user’s physical device were held in custody, there would be no on-device access to pictures, media, or contact information potentially useful in the context of an investigation.9
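The idea behind public-key, end-to-end encryption can be illustrated with a short sketch. The following is a toy Diffie-Hellman-style key agreement in Python, not Apple’s actual iMessage protocol (which uses per-device keys and standardized ciphers); the tiny prime and XOR “cipher” here are illustrative assumptions only. The point it demonstrates is the one in the text: two parties derive a shared secret from public keys, so an intermediary relaying the ciphertext – Apple included – cannot read the message without a private key.

```python
# Toy public-key key agreement and message encryption (demo only).
# NOT Apple's real protocol; parameters are illustrative assumptions.
import hashlib
import secrets

# Public parameters: toy-sized prime and generator.
# Real deployments use 2048-bit+ groups or elliptic curves.
P = 2**127 - 1  # a Mersenne prime
G = 3

def keypair():
    """Generate a private exponent and the matching public key."""
    priv = secrets.randbelow(P - 2) + 2
    pub = pow(G, priv, P)
    return priv, pub

def shared_key(my_priv, their_pub):
    """Both sides derive the same symmetric key from the exchange."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_cipher(key, data):
    """Toy stream cipher: XOR against a key-derived byte stream."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob each publish only a public key; the relay sees
# just ciphertext and cannot recover the plaintext.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
assert shared_key(a_priv, b_pub) == shared_key(b_priv, a_pub)

message = b"meet at noon"
ciphertext = xor_cipher(shared_key(a_priv, b_pub), message)
recovered = xor_cipher(shared_key(b_priv, a_pub), ciphertext)
assert recovered == message
```

Because the private exponents never leave the two endpoints, a court order served on the intermediary yields nothing decryptable – the core of Apple’s stated inability to comply.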

The terrain populated by Apple, other tech companies who employ end-to-end or full device encryption, and the government has largely been shaped by the regulatory efforts of Congress, which passed, in 1994, the Communications Assistance for Law Enforcement Act, mandating that “telecommunications companies … build into their systems an ability to carry out a wiretap order if presented with one.”10 In the twenty-one years since enactment, however, the law has not been updated to cover other ubiquities of modern communication, including email and the contents of a smartphone.11 Furthermore, the revelations by former intelligence contractor Edward J. Snowden regarding massive collection of metadata by the U.S. government have simultaneously contributed to backlash against expansion of the existing law and helped create a landscape where companies such as Apple aim to convince global consumers that their data is secure.12

Apple is well aware of the distinguishing effect of being one of a few companies who can offer this promise to consumers. In a privacy policy published on its website, Apple says it has never worked with any government agency to establish backdoor access, has never allowed government access to its servers, and never will.13

In this relatively uncharted territory, the regulatory void illuminates the need for clarity, but it is not clear that the traditional dichotomy of privacy v. security captures the problem. In the words of George J. Terwilliger III, a lawyer who worked on wiretapping issues in the Justice Department when analog phone networks evolved into digital ones, “If you ask about wiretap functionality in the broad privacy context, you get one answer … if you ask it in the context of a guy with a loose nuke or some kind of device, you get a different answer.”14 Additionally, giving the government backdoor access carries its own security risk: the same vulnerabilities used to comply with a court order could be exploited by nefarious individuals or foreign governments to gain access to sensitive user information.15 Apple CEO Tim Cook illustrated the nuance in a recent address at a technology conference, stating, “if you put a key under the mat for the cops, a burglar can find it, too . . . if criminals or countries know there’s a key hidden somewhere, they won’t stop until they find it.”16

The conversation begun by Snowden and spurred on by advances in encryption and the consumer response to more secure devices shows no signs of abating. The metaphorical glass house of privacy remains guarded by Apple’s refusal (and inability) to grant government access. Admittedly, a glass house is fragile, its integrity easily compromised by would-be snoopers. It is in this paradigm that Apple, consumers, and the Justice Department crowd together at the doorstep.

 


  1. See James Comey, Encryption, Public Safety, and “Going Dark,” LAWFARE (July 6, 2015, 10:38 AM), https://www.lawfareblog.com/encryption-public-safety-and-going-dark. 

  2. Id. 

  3. Id. 

  4. See Matt Apuzzo, David E. Sanger, & Michael S. Schmidt, Apple and Other Tech Companies Tangle With U.S. Over Data Access, N.Y. TIMES (Sept. 7, 2015), http://www.nytimes.com/2015/09/08/us/politics/apple-and-other-tech-companies-tangle-with-us-over-access-to-data.html. 

  5. VERACODE, STATIC DETECTION OF APPLICATION BACKDOORS, available at http://www.veracode.com/sites/default/files/Resources/Whitepapers/static-detection-of-backdoors-1.0.pdf. 

  6. See Matthew Green, A FEW THOUGHTS ON CRYPTOGRAPHIC ENGINEERING (Sept. 8, 2015, 8:55 PM), http://blog.cryptographyengineering.com. 

  7. See id. 

  8. See Brandon Hill, Apple Runs Afoul of US Justice Department Over Encrypted iMessages, Has No Plans to Back Down, HOT HARDWARE (Sept. 8, 2015), http://hothardware.com/news/apple-runs-afoul-of-us-justice-department-over-encrypted-imessages. 

  9. See id. 

  10. David E. Sanger & Brian X. Chen, Signaling Post-Snowden Era, New iPhone Locks Out N.S.A., N.Y. TIMES (Sept. 26, 2014), http://www.nytimes.com/2014/09/27/technology/iphone-locks-out-the-nsa-signaling-a-post-snowden-era-.html. 

  11. Id. 

  12. Id. 

  13. See Apple Inc., http://www.apple.com/privacy/government-information-requests/ (last visited Sept. 21, 2015). 

  14. See Apuzzo, Sanger, & Schmidt, supra note 4. 

  15. Id. 

  16. Ronald Bailey, Apple iPhone 6s Announcement Tomorrow: Encryption is Already Its Best Feature, REASON (Sept. 8, 2015, 11:14 AM), https://reason.com/blog/2015/09/08/apple-iphone-6s-announcement-tomorrow. 
