
Securing the Glass House: Apple, Encryption, and the Fight Over Backdoors (Part II)

The conversation discussed in the previous blog post in this series shows no signs of abating, although recent news suggests an evolution in the government’s position on backdoor access. Within the past three and a half weeks, it has come to light that the Obama administration, as indicated by James Comey, the current director of the Federal Bureau of Investigation, will not ask Congress for a bill requiring the tech industry to install backdoor access in its products that the government could use to access sensitive data. (See David Kravets, Obama administration won’t seek encryption-backdoor legislation, ARSTECHNICA (Oct. 9, 2015, 4:00 PM).) Director Comey, speaking at a Senate panel of the Homeland Security and Governmental Affairs Committee on October 8, 2015, suggested the government would instead focus on lobbying the tech industry as part of a continuing conversation to make locked devices available for criminal and terrorism investigations.1

Director Comey’s statement comes as some surprise to industry watchers, many of whom were expecting the government to pursue legislation to cure the gap between existing laws requiring companies to build in access pursuant to a government wiretap and the changed, post-Internet landscape.2 As recently as this summer, Director Comey was quoted as being in favor of backdoors.3 Given his former position, what accounts for the change? Furthermore, what does this portend for the encryption conversation?

Over the summer, a secret government working group on encryption developed an initial memo detailing policy objectives and an analysis of potential technical approaches companies could use to improve government access to encrypted data.4 The memo identified four lessons: first, there could be no ‘one-size-fits-all’ technical approach, as “each particular company would need to implement approaches specific to their” current implementation of encryption on their devices; second, different encryption implementations require different approaches, specifically regarding the categories of “data stored on devices held by consumers; the encryption of communications in transit between parties; and the encryption of data stored in remote locations”; third, intended use cases should drive proposed technical strategies; and fourth, technical approaches to access could be enforced in several ways.5 In addition to these lessons, the memo identified three technical challenges lacking a viable solution: first, strong encryption is increasingly becoming the industry expectation worldwide; second, encryption services rely heavily on open source software, meaning there generally is no central authority that could ensure compliance with proposed legislation through updates; and third, more-secure encryption could be layered atop less-secure encryption, meaning that an individual could theoretically use Skype or another messenger service to defeat government attempts to access their iPhone even if Apple were (in the future) to allow device decryption pursuant to a warrant. (See id.)

In light of these numerous challenges, the working group recommended nine principles to guide the United States government in its continuing discourse with the private sector regarding encryption.6 The memo suggests the working group released its findings with a keen eye to affecting the public conversation, and not just the private one between government and industry: “while all of the principles should inform private discussions with industry, some, or all of them could be incorporated into any public debate.”7 As outlined, the memo suggests that the government focus on targeted access instead of bulk collection; abandon unilateral backdoor access in favor of third-party-assisted access; push for regulation of government data collection through industry action rather than procedural protection; insist on the adoption of any U.S.-proposed solution by other nations; encourage industry to develop accessibility regimes that maximize security while minimizing complexity; push companies to design regimes that reduce the impact of backdoor exploitation; spur industry to respond to the governmental need for access while seeking to minimize any negative impact on innovation; leave providers (and not government) with the task of designing workable approaches to their own systems; and avoid disturbing the delicate psychology of trust associated with online communications.8

When questioned about the language of the memo, National Security Council spokesperson Mark Stroh sought to distance the administration from any of the proposed technical solutions, noting that while the administration “firmly supports the development and robust adoption of strong encryption . . . . the use of encryption by terrorists and criminals to conceal and enable crimes and other malicious activity can pose serious challenges to public safety . . . . [and] the administration continues to welcome public discussion of this issue as we consider policy options.”9

Despite the seeming retreat from a more entrenched position on the need for governmental access to backdoors, there remains evidence the debate has not been put to pasture. At a recent Wall Street Journal technology conference in Laguna Beach, CA, on October 20, 2015, Apple CEO Tim Cook and NSA Director Admiral Michael Rogers offered contrasting views on encryption, privacy, and national security.10 Cook, whose company has emerged as a sort of champion of individual e-privacy, reiterated what he has said on numerous occasions when questioned about his privacy stance: “We’ve said that no backdoor is a must, and we’ve said that encryption is a must.” (Id.) Regarding the government’s argument that it needed such access, Cook stated, “You’re saying, ‘They’re good, so it’s okay for them to know . . . . But that’s not the state of today. If someone can get into data, it is subject to great abuse.”11 In contrast, Director Rogers struck a more cooperative tone, while highlighting a subtle distinction: “Strong encryption is in our nation’s best interest . . . . we can’t do that in a world in which both sides castigate each other, ‘I’m good, you’re bad.’”12 As noted by Fortune, the operative use of “strong” instead of “full” in the realm of encryption suggests the government would prefer access to private user information over full encryption, which would render the data unreadable.13

The conversation on encryption has both a visible and behind-the-scenes component. While Apple, the tech industry, consumers, and the government attempt to come to some understanding regarding who (if anyone) should have a ‘second pair’ of keys to the glass house of privacy, it is worth noting that evolutions on policy (here, in the context of the government’s memo and subsequent statements) do not necessarily foreclose institutional desires for access. Rather, they emphasize and implicate enduring conversations to be had regarding regulation, innovation, and the very concept of trust itself.

  1. See id. 

  2. See id. 

  3. See James Comey, Encryption, Public Safety, and “Going Dark,” LAWFARE (Jul. 6, 2015, 10:38 AM). 

  4. See Read the Obama Administration’s Draft Paper on Technical Options for the Encryption Debate, WASH. POST, available at 

  5. See id. 

  6. See id. 

  7. See id. 

  8. See id. 

  9. See Andrea Peterson & Ellen Nakashima, Obama administration explored ways to bypass smartphone encryption, WASH. POST (Sept. 24, 2015). 

  10. See Kia Kokalitcheva, Apple CEO Tim Cook says no to NSA accessing user data, FORTUNE (Oct. 20, 2015, 4:27 PM). 

  11. Id. 

  12. Id. (emphasis added). 

  13. Id.