It has now been six months since FBI Director James Comey said that "encryption threatens to lead all of us to a very dark place." Since then, the FBI, Department of Justice, President Obama, and NSA have all taken potshots at encryption, each of them suggesting that the risk of criminals using the technology to hide from law enforcement outweighs the benefits of ordinary people wanting to keep their data and communications private.
The frustrating thing about all of this is how little the conversation has changed in the last six months. Comey and his counterparts at the NSA keep saying that they want lawful, technologically sound ways to access encrypted data if they are given permission to do so by a judge. People who understand the technology keep telling them that such a system is not possible.
Let's just reiterate that for a moment. To create an alternate way of accessing encrypted data, you must necessarily build a security hole, a backdoor, into that data. When you purposefully create security holes, those holes can be exploited by others (i.e., not just the FBI or the NSA). At that point, is it really still encryption?
"The notion that electronic devices and communications could never be unlocked or unencrypted—even when a judge has decided that the public interest requires accessing this data to find evidence—is troubling," FBI Executive Assistant Director Amy Hess wrote in a Wall Street Journal editorial. "It may be time to ask: Is that a cost we, as a society, are prepared to pay?"
She said the move to "ubiquitous encryption" will usher in an era in which criminals will run free after hiding incriminating evidence "without fear of discovery by the police."
It's time for the FBI and NSA to tell us what they really want. Because for the last six months, both agencies have been repeatedly asking for something that is simply technologically impossible. When confronted with that fact, the agencies resort to the sort of rhetoric that shows up in Hess's editorial and in Comey's speeches. They favor "robust encryption as a key tool to strengthen cybersecurity," but what does that mean? Who can have encryption, and what kind of encryption can they have?
It's worth noting that, until recently, the FBI recommended that you encrypt your phone. It's also worth noting that the man who wrote the Patriot Act thinks you should be allowed to use encryption.
NSA Director Michael Rogers has proposed what is known as a "split key" system, one in which a phone manufacturer would create an extra encryption key and then distribute its "parts" to different entities. It's a kind of escrow: someone else holds a key that can unlock your phone or your email or whatever. If the NSA or FBI gets a warrant to decrypt the data, it goes to the escrow holders, collects the key, and then has access to all of your data.
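Rogers has never published a concrete design, so the sketch below is purely illustrative: the function names and the two-share structure are assumptions, not anything the NSA has specified. It shows the simplest form of key splitting, a 2-of-2 XOR scheme. Either share alone is statistically random and reveals nothing about the key, but anyone who gathers both shares recovers the key in full, which is exactly why "who holds the parts" is the whole ballgame.

```python
import os

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; either share alone is useless."""
    share_a = os.urandom(len(key))                        # one-time random pad
    share_b = bytes(p ^ k for p, k in zip(share_a, key))  # key XOR pad
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the two shares together to recover the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

# Hypothetical per-device key, split between two escrow holders.
device_key = os.urandom(16)
holder_one, holder_two = split_key(device_key)

# Neither escrow holder learns the key on its own...
assert holder_one != device_key and holder_two != device_key
# ...but combining the shares reconstructs it exactly.
assert recombine(holder_one, holder_two) == device_key
```

Note what the sketch does not solve: anyone who compromises both repositories, or the process that answers warrant requests, gets the same full key the government does.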
The problems with this suggestion are numerous. Who holds the key? Who can you trust with the key? Joseph Lorenzo Hall, chief technologist with the Center for Democracy and Technology, calls it "not a serious proposal," for lots of reasons. The FBI and NSA act as though the United States is the only country in the world that wants access to encrypted data. It's not.
American companies are overwhelmingly dominant globally, and companies like Apple, Google, Facebook, and Twitter make heaps of cash overseas. Facebook and Twitter both have a history of caving to the demands of autocratic governments when faced with the possibility of being shut down. So, if the US gets its "golden key" for WhatsApp users, does Turkey get one too? Does Pakistan? Does China? Does Russia? How are you going to make all these keys and keep them separate? What happens if someone gets ahold of them?
And what happens to those American companies that are shipping products globally with built-in backdoors allowing US law enforcement to access user data? Such a provision doesn't seem likely to go over well in, say, Germany.
This conundrum is the exact same one that the US ran into back in 1997, Hall wrote:
"We demonstrated [in 1997] that there would be no provable secure way to communicate using split-key key escrow systems, so certain types of sensitive transactions involving health information, financial information, and intimate information would be more vulnerable to interception in the case of a flaw, compromise, or abuse of the system. Also, securing repositories of keying material, validating requests for keys, and distributing keys would be exceedingly complex, and likely much more complex than the underlying encryption itself. This is costly to say the least, but it can also be dangerous in that adding complexity to a system will inevitably lead to additional methods to undermine it and find vulnerabilities that can be used to attack it."
This is also the exact same conclusion reached by Matthew D. Green, a security researcher at Johns Hopkins University. Here is the premise of a recent blog post he wrote: "Let's pretend that encryption backdoors are a great idea. From a purely technical point of view, what do we need to do to implement them, and how achievable is it?"
Green's entire blog post is worth reading, because he does outline several ways in which such a system could be implemented. Each of those backdoors essentially amounts to an attack on encryption that would A) not work, B) be ridiculously expensive and difficult to implement, or C) create unnecessary and exploitable vulnerabilities. Green is widely seen as one of the best in the business when it comes to this stuff. Basically, he knows his shit.
And here is the conclusion he reaches:
"If this post has been more questions than answers, that's because there really are no answers right now. A serious debate is happening in an environment that's almost devoid of technical input, at least from technical people who aren't part of the intelligence establishment."
The Washington Post notes in an article outlining the split key idea that neither the NSA nor the FBI can, or will, name a single instance in which they were unable to thwart a terrorist or punish a criminal because they couldn't break encryption. Meanwhile, the NSA and the FBI are plugging their ears and screaming about "bad guys" and "darkness" when it comes to encryption. They are not offering technical solutions, they are not offering alternatives, they are fear mongering.
So, what does the intelligence community want? Do they want to ban encryption for consumers? Do they want to create a system of weak encryption that can be attacked by any hackers with a modicum of skill?
It is very likely that, sometime in the future, a criminal will walk free because he had the foresight to encrypt incriminating evidence. Thing is, there are far more crimes committed against ordinary people who don't use encryption than there are criminals who will walk free. Does the intelligence community think that a future without locks is going to make us safer overall?