Silicon Valley, Spies, and Empathy
Earlier this week, MIT welcomed a distinguished guest to speak as part of its Internet Policy Research Initiative: Robert Hannigan, director of GCHQ, Britain’s version of the National Security Agency. Extending an olive branch to a fiercely skeptical audience, Hannigan tried to clarify the elusive balance between cybersecurity and national security when it comes to encryption.
His talk in Cambridge took place amid a raging legal battle between Apple and the FBI over the tech giant’s refusal to unlock an iPhone belonging to one of the San Bernardino shooters. The FBI needs Apple to modify the operating system on the attacker’s phone so that the bureau can gain access to its contents; without such a modification, those contents would be automatically erased after ten failed attempts to guess the shooter’s passcode. By circumventing this automatic self-destruct mechanism, an Apple-written “backdoor” would enable the FBI to gain access to the phone by trying every possible passcode.
Apple argues that the FBI is imposing an unfair burden on the company and is violating its right to freedom of speech. It’s relying on prior cases in which software code has been deemed a form of speech. Major tech firms including Google, Amazon, Facebook, and Microsoft have lined up in support of their traditional rival.
Silicon Valley fears that a ruling favorable to the FBI would set a precedent that ultimately undermines the tech industry’s ability to secure its products. Because any “backdoor” available to law enforcement might also be exploited for nefarious purposes, customers might grow wary of using these devices. Ultimately, the move could undermine both the industry’s and the country’s economic interests.
The GCHQ director stressed he did not come to offer a panacea to such dilemmas. Instead, Hannigan argued, solutions would have to be diverse and dynamic to address different contexts and shifting circumstances.
In a controversy that is often framed as a binary (privacy vs. security, backdoor vs. locked), he suggested all involved might benefit from looking beyond a simplified black-and-white characterization. The engineers asking questions after his talk were not easily won over. One cryptographer asked why he shouldn’t include Britain’s spy agency in his “threat model,” given allegations of abuses. Another attendee questioned the director’s assertion that Tor, an anonymity tool, is predominantly used by criminals. A third expressed skepticism toward the British courts’ Orwellian distinction between “bulk collection” and “bulk surveillance.”
The lively debate was a testament to the increased public scrutiny heaped on spying agencies since Booz Allen Hamilton contractor Edward Snowden leaked documents detailing the practices of the NSA and GCHQ. Although many had suspected the scale of these surveillance efforts was enormous—in part thanks to earlier whistleblowers like William Binney and Thomas Drake—Snowden’s files drew public attention to the tension between privacy and large-scale information gathering in the name of security. But how can we debate the trade-offs between the cybersecurity of individual devices and our national security interests? Aren’t both important?
The case of the San Bernardino shooter’s iPhone has highlighted the intractability of this predicament more clearly than ever. Given the critical importance of both cybersecurity and national security, prioritizing one over the other is no trivial task. Indeed, FBI Director James Comey said it was “the hardest question I’ve seen in government,” and Hillary Clinton called it the “worst dilemma ever.” Clearly, there are no easy or obvious answers.
Apple supporters argue that mandating restrictions on security would endanger those who rely on encryption, including journalists and activists in foreign countries. Indeed, as Bloomberg View columnist Eli Lake has pointed out, the US government has helped develop and spread user-friendly encryption technologies for precisely this reason—to support dissidents from China to Russia. Critics counter that such strong encryption enables criminals to communicate without fear of detection.
Given the situation’s complexity, the GCHQ director's call for open, calm, nuanced public dialogue appears sound. Whenever participants in a debate frame problems in black and white and call for silver-bullet resolutions, it’s best to summon the most potent superpower we humans have: empathy. It’s essential that we try to see and feel the perspectives of those with whom we disagree.
It may well turn out, as advocates of tech firms suggest, that there is an unacceptable security trade-off in accommodating law enforcement’s demands. But it’s also possible that the opposite is true: that such technology could prevent law enforcement from ensuring our safety. So before you dig in your heels and defend a position at all costs, try to see the world differently. In the privacy camp, defending cybersecurity at all costs? How would you feel if you had family members killed in the San Bernardino attack? Or if threats were made today against you or your family? Convinced that law enforcement should take priority? What if hackers could track your every move and every communication? Would that make you feel safer or more vulnerable?
We humans tend to prefer clearly defined and certain situations rather than more ambiguous ones. Seeing the world through the eyes of those with whom we disagree is one way to appreciate complexities. Indeed, adopting multiple perspectives may offer our best hope for finding common ground. When it comes to cybersecurity, privacy, and national security, doing so may lead us, as Hannigan suggested, to a hodgepodge of partial, diverse, and impermanent solutions. As unsatisfying as that may seem, it may be the best we can hope for at this point in time.
Taking a step back, analyzing the situation from multiple perspectives, and empathizing with those with whom we disagree is the best chance we have to tackle this and the many other conundrums we will increasingly face in the hyper-connected 21st century.
Vikram Mansharamani is a Lecturer at Yale University in the Program on Ethics, Politics, & Economics. He is the author of BOOMBUSTOLOGY: Spotting Financial Bubbles Before They Burst (Wiley, 2011). Visit his website for more information or to subscribe to his mailing list. He can also be followed on Twitter or by liking his Page on Facebook.