The FBI-Apple phone cracking case

The following research was prepared as part of a class assignment, the culmination of which was an Oxford-style debate between my course on ICT Law and Policy, taught by Georgetown’s Dr. Meg Jones, and another of Dr. Jones’ classes on Privacy. Our team was assigned the position of defending the FBI’s case, which would have mandated that technology companies either build so-called “backdoors” into their security or, at the very least, aid federal agencies in cracking it. Because of the nature of the assignment, the views in this piece are not precisely my own. However, I believe the position presented raises many valid questions and, more importantly, demonstrates my ability to research and communicate the details of nuanced technical-legal issues.

A heavily edited and shortened version of this piece was published as a blog post for gnovis, the academic journal which sponsored the debate.

********************

When President Obama took to the stage at SXSW, one of America’s premier technology, music, and film conferences, to stand by the federal government’s stance on commercial encryption technologies, he must have known he was entering the lion’s den. Technologists from across the spectrum – major companies like Amazon, Google, and Facebook, as well as computer security experts and hacker-types, all of whom are well represented at SXSW – have supported Apple’s refusal to create software for the FBI that would unlock the cell phone of the now-deceased perpetrator of the attacks in San Bernardino, California in Dec. 2015.

Yet rather than picking a fight and adding to the increasingly tense tenor of the debate, the president took a thoughtful, measured approach, calling for a solution that balances individual privacy rights with national security. And while the number of times the word “Orwellian” has been thrown around in this debate might suggest a clear, almost universal opposition to the FBI’s case, the American public is split on the issue, depending on the poll. It is a challenging problem, and it makes sense that Americans aren’t sure what to think: the massive technological advances of the last 20 years, paired with developments in global terrorism, have shifted the landscape. On one hand, we have the NSA spying on us; on the other, terrorist attacks and mass shootings have become almost routine.

I am one of those confused Americans. I believe that both sides have valid points and significant flaws. Much of the media’s attention, including comedian John Oliver’s recent segment on the issue, has focused on faults in the FBI’s position. But a few of Apple’s arguments keep troubling me, and I haven’t heard them addressed.

There is rightfully a lot of talk about what encryption technology can and cannot do. But there is also a subtler argument playing out in the way we talk about the technology. This is not the only debate in recent memory in which the term “Orwellian” has been slung around. The documents revealed by Edward Snowden in 2013 unleashed a wave of similarly accusatory language. Leaving the accuracy of those comparisons to the many hundreds of blogs and articles written on that still-unfolding drama, it’s difficult not to notice how easily the anger and rhetoric from that debate shifted to the current one. Yet these are different cases in a number of ways: the FBI and NSA are distinct government organizations, with completely different legal methods for obtaining data, different motivations (evidence vs. intelligence gathering), and completely different technologies at their disposal. While it is perfectly legitimate to disagree with both organizations independently, many who have taken Apple’s side in the current debate are tactically pairing the two, redirecting negative public opinion about the NSA toward the FBI.

Then there are, of course, the similar rhetorical pairings with the Crypto Wars of the 1990s. It was in this period that the government first attempted to combat improvements in encryption technology that would prevent law enforcement from accessing digital data (for an excellent history of the Crypto Wars from the privacy side of the debate, see the New America report cited below). In response, in 1993 the Clinton administration developed the “Clipper Chip,” a hardware device that, when installed in communications devices, would encrypt data while still leaving an access point for law enforcement. Although the device seemed promising at first, it quickly came under fire from technologists, civil liberties advocates, and the tech industry over several issues: vulnerabilities in the “key escrow” accounts where encryption keys would be stored; the cost of installing Clipper Chips in every device; uncertainty about the circumstances under which law enforcement would be able to access data; and, after a discovery by Bell Labs researcher Matt Blaze, whether the escrow mechanism could be bypassed entirely.

Although the idea of a Crypto War 2.0 has been brewing for some time, and many of the value questions are the same, there are significant differences between the technologies in play. The Clipper Chip was a government-created and government-regulated system which, if adopted, would have had significant impacts on the hardware and software industries; the software requested by the FBI last month would be built and controlled by Apple. Moreover, one of the most heavily criticized elements of the Clinton plan, the “key escrow” system in which the federal government would store keys to every Clipper Chip-equipped phone, is not part of the new debate. Beyond the technology, the context is significantly different: mass shootings and ISIS were not daily concerns in 1996 (then again, neither was NSA spying). Yes, there are problems with the new plan, but they are different from those of the first Crypto Wars – it’s much more complicated this time around.

In a more abstract sense, the term “backdoor” carries effects as powerful as the comparisons to the Crypto Wars and the NSA-Snowden drama, despite considerable ambiguity about what the decidedly nefarious-sounding term actually means: “[A backdoor] can refer to a legitimate point of access embedded in a system or software program for remote administration” (http://www.wired.com/2014/12/hacker-lexicon-backdoor/). This general term for “access to a computer” is often used in tandem with encryption, where a “backdoor” would mean a key for decrypting data. The security value of the widely used public-key encryption method would be decimated by giving an agency access to those keys, as any security expert (including Whitfield Diffie, co-inventor of public-key cryptography) will tell you (https://www.newamerica.org/oti/doomed-to-repeat-history-lessons-from-the-crypto-wars-of-the-1990s/, p. 6).
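To make concrete why handing out decryption keys is so corrosive, here is a minimal sketch in Python using the `cryptography` package – my own illustration, not the Clipper Chip’s actual scheme or anything Apple uses. Whoever holds a copy of the private key, whether the intended recipient or a hypothetical escrow agency, can read the plaintext, so the security of the whole system collapses to the security of wherever that copy sits.

```python
# Illustrative only: RSA-OAEP with Python's "cryptography" package.
# This is NOT the Clipper Chip's design or Apple's; it simply shows that
# any party holding a copy of the private key can decrypt the message.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa, padding

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Hypothetical "key escrow": a second copy of the private key handed to an agency.
escrowed_copy = private_key.private_bytes(
    serialization.Encoding.PEM,
    serialization.PrivateFormat.PKCS8,
    serialization.NoEncryption(),
)

ciphertext = public_key.encrypt(b"meet at noon", oaep)

# The escrow holder decrypts just as easily as the intended recipient.
agency_key = serialization.load_pem_private_key(escrowed_copy, password=None)
assert agency_key.decrypt(ciphertext, oaep) == b"meet at noon"
```

The point of the sketch is simply that public-key encryption has no notion of a “law-enforcement-only” key: any extra copy is a full copy.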

But what the FBI is asking for in this case is a different type of backdoor; it would not affect the encryption algorithm Apple uses on its phones. Since iOS8, Apple has made it standard that almost all of the data on a phone is encrypted as long as the phone is locked behind the familiar 4-6 digit passcode screen. Prior to iOS8, this was not the case (http://blog.cryptographyengineering.com/2014/10/why-cant-apple-decrypt-your-iphone.html). For pre-iOS8 systems, Apple could (and did – 70 times since 2008) bypass the passcode screen and give law enforcement access to any unencrypted information. So the “backdoor” the FBI is requesting is not so much an unprecedented, encryption-breaking breach of consumer privacy: it is essentially a request for the same level of assistance Apple provided under previous operating systems, or perhaps even less. The FBI has asked Apple to create a version of its software that disables the auto-erase feature triggered after 10 incorrect passcode attempts, that allows passcodes to be entered electronically rather than typed in by hand, and that removes the escalating delays between failed entries. The FBI would then use “brute-force” methods – trying every possible passcode – to unlock (and thereby decrypt) the phone.
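To put that brute-force step in perspective, here is a rough back-of-the-envelope sketch in Python – my own illustration, not anything from the court filings. The roughly 80 milliseconds per attempt is the hardware-bound key-derivation time Apple describes in its iOS security documentation; treat that figure, and the simple worst-case arithmetic, as assumptions.

```python
# Back-of-the-envelope: worst-case time to brute-force a numeric passcode
# once the wipe-after-10-failures rule and the escalating delays are removed.
# ASSUMPTION: ~80 ms per attempt, the key-derivation time Apple cites in its
# iOS security guide; the real rate could differ.
SECONDS_PER_ATTEMPT = 0.08

def worst_case(combinations: int) -> str:
    """Format the worst-case search time in human-friendly units."""
    seconds = combinations * SECONDS_PER_ATTEMPT
    if seconds < 3600:
        return f"{seconds / 60:.1f} minutes"
    if seconds < 86400:
        return f"{seconds / 3600:.1f} hours"
    return f"{seconds / 86400:.1f} days"

for digits in (4, 6):
    combos = 10 ** digits
    print(f"{digits}-digit passcode: {combos:>9,} combinations, "
          f"worst case ≈ {worst_case(combos)}")

# 4-digit passcode:    10,000 combinations, worst case ≈ 13.3 minutes
# 6-digit passcode: 1,000,000 combinations, worst case ≈ 22.2 hours
```

In other words, once Apple’s software protections are out of the way, the passcode itself is all that stands between the FBI and the decrypted data – which is exactly why the request targets those protections rather than the encryption.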

This use of “backdoor” is reminiscent of Gillespie’s analysis of YouTube’s use of the term “platform” as a discursive tool to frame public perception of, and policy action toward, the company (Gillespie 2010). It may be that, once this debate plays out, we decide this type of backdoor is just as dangerous to security as a public-key encryption backdoor and should be treated with the same level of skepticism. There are potentially dire consequences to a leaked “govtOS,” just as there would have been to a compromised “key escrow” system. But for the moment, lumping all forms of “backdoor” into one big, Orwellian pile looks more like tactical rhetoric from Apple than technical reality.

The iOS8+ operating systems that produce this backdoor issue are also tied to one of Apple’s more practical arguments: that the order issued on the FBI’s behalf places an undue burden on the company to create software to unlock the phone. There are two problems with this argument. First, Apple itself said it would take only 4-8 engineers 4-6 weeks to build the software (http://arstechnica.com/apple/2016/02/heres-how-apple-would-build-crypto-cracking-software-for-the-fbi/). For a company that boasted $234 billion in revenue in 2015 and has over 90,000 employees, that hardly seems unreasonable.

The trickier and more costly issues, which get at the second problem with Apple’s argument, involve having to either (a) store this “govtOS” in a way that keeps it safe from hackers and foreign powers, or (b) destroy it and rebuild it every time the FBI comes calling. On the first point: Apple must already have maintained private software that it used to access pre-iOS8 phones for law enforcement, and that software never leaked, so the concern seems suspect. What about the process by which Apple provides iCloud data to the FBI? Certainly there would be some risk, but there is risk involved in every step of connected computing.

The latter problem – having to rebuild the software for every inquiry, and that precedent spreading to other agencies and other companies – is surely the largest concern of Apple and the other tech companies that have come to its defense in this case. Apple has pointed to the New York District Attorney’s office, which has 175 phones in need of decrypting in active cases, as evidence that the current case will lead to Apple and other companies having to establish separate offices solely to deal with requests from law enforcement. These companies obviously don’t want that, and neither does the American public, or even the government, really: we want them figuring out new and creative ways to collect data on us and sell us more stuff (but that’s another argument).

Conversely, those 175 phones could be cited as evidence that what Apple has done affects not just abstract, global terror cases but ongoing cases in which murderers and sexual assailants haven’t been caught. And Apple had to know this would happen. In a controversial series of blog posts for the Washington Post in 2014, legal scholar Orin Kerr called Apple’s advanced encryption on post-iOS8 systems a “dangerous game,” saying that Apple was “thumb[ing] its nose at that great tradition” of balance between privacy and security provided by the warrant application process: the new operating system “… stops the government from being able to access the phone precisely when it has a lawful warrant signed by a judge.” Although Kerr tempered his argument in subsequent posts, acknowledging the numerous technical concerns raised by Julian Sanchez among others, the fact remains that Apple has made a decision with serious policy implications in the name of its consumers. The government’s response is hardly surprising: in effect, the new operating system is a form of what Lawrence Lessig would call architectural regulation, an instance of code supplanting law (Lessig 2000). The system not only regulates user behavior by mandating encryption; it also regulates law enforcement’s ability to conduct searches in a way that, until now, has been governed by law.

Despite the general atmosphere of technological determinism surrounding the information age, and this case in particular, there is nothing “natural” or necessary about the software Apple has created. Other regulatory modalities can push back against strong encryption, as is happening with new legislation in the UK, for better or for worse. Perhaps at a time when trust in the government is low, especially with regard to privacy, we would rather leave this decision to corporations. As Sanchez notes, technology does not always offer “Goldilocks policy options,” so we may have to stumble through this issue for a few years. If Apple is forced to provide a “backdoor,” a major hack or an incident of government spying may well prompt a new response in norms or technology, like the use of VPNs in China. If Apple wins its case, a potentially preventable terrorist attack could lead to new, Patriot Act-style legislation. If nothing else, the re-ignition of this debate will hopefully push the public to consider what it values, rather than leaving the decision to corporations and policymakers without ever having a say.

Gillespie, Tarleton. “The Politics of ‘platforms.’” New Media & Society 12.3 (2010): 347–364. nms.sagepub.com. Web.

Lessig, Lawrence. Code: And Other Laws of Cyberspace. New York, N.Y.: Basic Books, 2000. Print.
