Wednesday, July 4, 2018

Lawmakers should accept reality that digital communication can never be 'too secure'

The balance between security and privacy is more delicate than ever. Faced with global terrorism threats, the American people and courts have largely accepted infringements like warrantless bag searches in mass transit systems as a necessary evil. On the other hand, growing cyber risks and the continued misuse of user data by service providers have provoked visceral public reactions to practices resembling “mass surveillance,” prompting tech companies to beef up security in their offerings and the Supreme Court to rule in favor of stronger privacy protections.
The latest tide of data-protection improvements in communication apps and devices has fueled what may be the loudest security-versus-privacy debate yet: whether the U.S. government can mandate extraordinary access, or so-called “backdoors,” in every technology to circumvent the progress the industry has made in protecting personal and business data. And while most voices are debating the “can we” question, the “should we” question deserves as much if not more attention.
Most security experts have long argued that it is impossible to design a “secure” backdoor, one that does not impose a net increase in risk on the end user. But this isn’t simply a question of whether we can find an acceptable technical solution to enable extra government access; it is a matter of freedom, of slippery-slope decision-making and of the kind of country we are prepared to live in.
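To make the technical objection concrete, here is a minimal, purely illustrative sketch of a key-escrow style backdoor, not any vendor's or agency's actual design. It assumes the third-party Python `cryptography` package, and the key names (`recipient_key`, `escrow_key`) are hypothetical. What it shows is that a mandated backdoor is simply a second decryption path: whoever holds, steals or leaks the escrow key can read every message it covers.

```python
# Illustrative sketch only: a key-escrow scheme where each message key is
# wrapped for the recipient AND for a hypothetical escrow holder.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

recipient_key = AESGCM.generate_key(bit_length=256)  # held by the intended recipient
escrow_key = AESGCM.generate_key(bit_length=256)     # hypothetical government escrow key

def encrypt_with_escrow(plaintext: bytes) -> dict:
    # Encrypt the message once with a fresh per-message key...
    message_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(message_key).encrypt(nonce, plaintext, None)
    # ...then wrap that key twice: once for the recipient, once for escrow.
    n1, n2 = os.urandom(12), os.urandom(12)
    return {
        "nonce": nonce,
        "ciphertext": ciphertext,
        "wrapped_for_recipient": (n1, AESGCM(recipient_key).encrypt(n1, message_key, None)),
        "wrapped_for_escrow": (n2, AESGCM(escrow_key).encrypt(n2, message_key, None)),
    }

def decrypt(envelope: dict, unwrap_key: bytes, which: str) -> bytes:
    # Either key holder can recover the message key and read the message.
    wrap_nonce, wrapped = envelope[which]
    message_key = AESGCM(unwrap_key).decrypt(wrap_nonce, wrapped, None)
    return AESGCM(message_key).decrypt(envelope["nonce"], envelope["ciphertext"], None)

env = encrypt_with_escrow(b"meet at noon")
assert decrypt(env, recipient_key, "wrapped_for_recipient") == b"meet at noon"
# The escrow path works just as well; compromise of that one key exposes all traffic.
assert decrypt(env, escrow_key, "wrapped_for_escrow") == b"meet at noon"
```

That single, high-value escrow secret is the net increase in risk experts keep pointing to: it must be generated, stored, shared and protected indefinitely, and every party that touches it becomes a target.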

In January, FBI Director Christopher Wray characterized law enforcement’s inability to access encrypted data as an “urgent public safety issue.” Deputy Attorney General Rod Rosenstein has encouraged lawmakers to pass legislation forcing technology providers to re-architect their systems to enable government access. In 2016, the FBI served, then withdrew, a legal demand to force Apple to redesign the iPhone’s security to facilitate the San Bernardino investigation. Apple resisted the order at the time, arguing that the demanded functionality was “too dangerous to create,” a position other tech companies supported.
I have been close to this issue for most of my professional life. For nearly a decade, as a law enforcement officer and computer crime investigator, I executed search warrants against criminal suspects, conducted electronic surveillance and performed forensic analysis on digital evidence. In that capacity, I at times saw the Constitution, despite its genius, as my toughest adversary, one that made catching criminals much harder. But as an information security professional, I also knew that the criminal threats facing the people and businesses I served to protect are real and relentless. And for those of us in the non-stop arms race against state and non-state attackers, there is never room for compromise.
As I teach digital privacy and security to undergrads, I have come to see how often new technologies become harbingers of emerging privacy issues. Near the end of the 19th century, for instance, the invention of the portable camera and the rise of “yellow journalism” are said to have inspired Warren and Brandeis’s 1890 Harvard Law Review article “The Right to Privacy.” That piece provided a legal basis for the individual’s right to privacy in the United States.
The question we face now is whether our rights to Life, Liberty and the Pursuit of Happiness extend to securing our private information and property when and how we see fit. Viewed in the proper light, the Fourth, Fifth and Sixth Amendments to the U.S. Constitution define a default government posture that is deferential to individual privacy. Take the Fourth Amendment, which protects people from unreasonable search and seizure: it clearly recognizes that one’s business at home is one’s own unless there is compelling evidence to warrant the trespass.
While those who argue that backdoors do not violate the Fourth Amendment may be technically correct to the extent that the government won’t actually exploit the backdoor without a warrant, they are missing a more important point. The Fourth Amendment obliges the government to obtain a search warrant, which provides the authority to overcome an individual’s privacy and search for information related to criminal activity. It does not oblige the citizen to possess the particular information sought by the government.
Flip these obligations around and we find ourselves on the edge of an extremely slippery slope. Do we now require everyone to make all of their actions available for government review in perpetuity? To be clear, we’re talking about individual citizens in their private affairs — not employees of regulated organizations or public servants conducting official business.
If the government can force service providers to implement an encryption scheme that grants access to private citizens’ data at some point in the future, what happens when customers decide to start deleting data so that it can’t be accessed? Would we collectively be OK with the government having the power to force people and companies to save copies, just in case they are ever needed? Would this edict extend to every computer file, every message, every screen share over the web and every phone call? If so, it is more than the power to investigate or protect. It becomes life on the record.
The slippery slope gets scarier as you go down. Can we outlaw gloves to ensure that fingerprints are left at every crime scene? Why not require all citizens to be tracked at all times? When we develop a device that can read another person’s thoughts, could we outlaw the tinfoil hat to ensure that everyone’s thoughts are accessible to the government? Is anything out of bounds if the goal is security? Tread carefully along the divide between can and should.
Finally, in a free society, do tech providers serve the interests of the customer or the government? If people have the right to use “unbreakable” encryption, do they not also have the right to have someone build it for them? If, for argument’s sake, we say that American companies aren’t allowed to build it, couldn’t citizens still encrypt their own data or turn to providers outside U.S. jurisdiction? There is little point to any of this if all we are legislating is how convenient strong security can be.
Providers are stuck in the middle of an argument that should be between the government and the individual. It is time for lawmakers and everyone else who believes in mandatory backdoors to stand up straight, speak in a clear voice and call for an outright ban. Let’s talk about an encryption ban, or maybe even a ban on privacy in general. Let’s publicly discuss the criminalization of any form of security that is “too good” for our own good. Let’s call it what it is and have the debate, in the open. And when it’s over, let’s see what kind of country we have left.
Chris Howell is the CTO of Wickr, a secure communications company focused on transforming the way companies and organizations protect high-target communications.
