If You Criminalize Security, Only Criminals Will Be Secure
Recently, U.S. law enforcement officials have re-energized their push for a technical means to bypass encryption, pointing to Symphony: a chat application adopted by banks in 2015 that includes backdoor access functionality for use in investigations. This system, officials argue, could be implemented more broadly to provide law enforcement with secure backdoor access to all communications. They further claim to have consulted technical experts (whom they have not named) who believe this approach could be implemented more broadly. Publicly, security experts have expressed skepticism as to whether a Symphony-like system could be safe, particularly if scaled out the way officials seem to want. Policy experts have further explained that these conversations increasingly provide political cover for bad laws and practices in other countries. For instance, in the last few years China has passed a range of laws with serious implications for human rights. As a result of these laws, Apple has announced that it will contract with a Chinese company to provide its iCloud service to Chinese users. This means all iCloud data – including encryption keys – would have to be stored locally.
The Apple example is a timely reminder that the ongoing debate over encryption backdoors has important global implications. Additionally, it provides a concrete example of how even the staunchest advocates for encryption can buckle in the face of powerful government actors. Everyone on the Internet has a shared interest in safe encryption, and so any system that would undermine security should be a non-starter. However, even putting aside these technical debates, and granting the dubious assertion that a backdoor could be implemented securely, there are a number of practical reasons why backdooring encryption would still be a bad idea.
Just who are the “good guys”?
Any system mandating some hypothetical secure backdoor would need to decide who can obtain access and under what circumstances. Even if it were possible to build a backdoor which only “good guys” could use, this would still leave tech companies in the position of having to determine who counts as a “good guy.”
To American lawmakers, it may seem logical to require that access be provided to U.S. law enforcement in line with U.S. law, including, at a minimum, the receipt of a valid warrant. However, digital communications are international, as are the tech firms at the center of this debate. If Apple, for example, had the technical capability to circumvent encryption on its devices, and it had a policy of facilitating access for U.S. law enforcement when presented with a warrant, it would face tremendous pressure to provide equivalent services to other governments and, in some cases, like China, even a legal obligation to do so.
While it may seem acceptable enough for a company to cooperate with warrants in friendly democracies with independent judiciaries, like Canada or Germany, a request from the Chinese police to help track dissidents, or from Saudi police to catch anyone using Grindr, would place the companies in a difficult position. Many repressive states, including China, already have laws requiring cooperation from the private sector. Apple’s acquiescence to relocating its Chinese users’ data into China was made in response to new legislation requiring local storage. While Apple has indicated that this will not undermine security for users, experts have pointed out that that is not a guarantee it can make. The fact that tech firms are unable to break their own security features limits states’ ability to demand access to user information, though this would certainly change if a backdoor were introduced for American law enforcement.
If a tech firm introduced a backdoor into its systems, it would therefore have two options: it could facilitate access for all governments equally, which would mean complicity in a wide range of human rights abuses, or it could commit to evaluating every request for access on its merits and potential human rights impact. In the latter case, besides being manifestly unqualified to perform this role, tech firms would find such a stand very difficult to maintain. The mere capability to facilitate backdoor access would subject companies to tremendous pressure, and refusal would carry high stakes: Chinese law sets no upper limit on the fines that can be imposed for non-compliance with government demands for access. A non-compliant company could also risk losing access to that market entirely, or even seeing its employees jailed or harmed. Given these alternatives, it is not surprising that tech firms have thus far sought to avoid these outcomes by maintaining a technical inability to provide access to their users’ secure communications. The real costs of backdoors would be borne by ordinary people, and by the global tech firms forced to shoulder the financial and moral burden of supporting repressive governments.
Just how secure is your “secure” backdoor?
While government officials may seem comfortable with the idea that exceptional access can be provided securely with today’s technology, they fail to examine the future implications. Major data breaches are already happening with alarming frequency, with compromised credit card numbers or social security numbers for sale online for only pennies. While encryption cannot prevent all breaches, it may well be our best defense against the activities of malicious actors, something that will only grow more important as the Internet of Things continues to expand. Intel has forecast that there will be more than 200 billion IoT devices in use by 2020, including cars, smoke detectors, and home security systems, most of which will generate sensitive user data. In a connected world, digital security will likely become a matter of life and death. The next generation of breaches may threaten not just your data, but your life and the lives of your loved ones.
And these actors’ techniques are growing ever more sophisticated. Offensive strategies for gaining access to lucrative data sets are constantly evolving, forcing those who control the data into a perpetual race to stay half a step ahead of anyone probing for a chink in their systems. Any limitation on the strength or type of encryption that can be used to protect data would be the equivalent of tying one hand behind the back of every engineer working to protect user information.
Complicating matters further, governments from the U.S. to China are already making strides toward quantum computing, which could potentially break any encryption used today. And while such breakthroughs may remain the exclusive province of governments for a time, they inevitably trickle down to non-state actors. To protect everyone, companies must be incentivized to constantly pursue better and stronger means of protecting data if they are to have any hope of facing down evolving generations of would-be criminals.
Government can legislate, but the market will respond
It is nearly impossible to keep people from accessing certain tools or technologies online. In 2017, Russia passed a law banning the use of Tor, a program for facilitating anonymous, encrypted communication, but as of early 2018 there were still over 250,000 daily users of the service in Russia. Despite repeated lawsuits and the jailing of its founders, The Pirate Bay website remains stubbornly online. As long as there are countries on earth that do not mandate backdoors for encryption, or do not impose these rules universally, strong encryption is going to remain available.
The most likely result of a move to introduce backdoors into tech products would be a migration by criminals and terrorists to smaller, less regulated products that could offer strong encryption without consequence. The use of encryption tools is quickly becoming a basic criminal skill, much like hot-wiring a car or buying a stolen credit card. Eventually, the only people still using tools subject to the government mandate would be ordinary people without the knowledge or incentive to adopt alternatives, and for those people the repercussions of weaker security could be serious and long-felt. This is one of the main reasons why experts agree that it is unlikely that any access mechanism, even if immediately effective, would maintain its efficacy over time.
Building and securing the architecture to ensure government access would also impose a significant cost on companies. The level of sophistication involved would be a particular challenge for smaller firms, which would inevitably lead to their products being less safe. This insecurity would be compounded if different governments mandated different access mechanisms, with users bearing the weight of any breach of their sensitive information and communications. Accordingly, mandatory encryption backdoors can be seen as a market-distorting force, making security so expensive that only the biggest companies could offer it. While this impact could be mitigated by limiting a rule requiring backdoors to only apply to companies over a particular size, that would in turn further limit the mechanism’s efficacy, and anyone who needed security for illegitimate ends would presumably migrate to services outside the scope of the rule.
Why is this necessary again?
Government officials have pushed for decades to build a backdoor into encrypted systems, yet we still have not answered the basic threshold question: would a backdoor access regime even give law enforcement the information they are seeking? Even more important, we don’t know what it is that law enforcement is seeking. While some figures have been published about the number of encrypted devices in the custody of U.S. officials, these data are not useful. Was the encrypted data critical? Could it have been accessed in other ways? What is the likelihood that any encrypted information would have actually contributed to the case? What cases are we looking at, and what is the technical sophistication of the criminals involved?
Law enforcement claims that modern criminals are increasingly sophisticated, and warns that the spread of encryption means investigators are “going dark”, harming their ability to do their jobs. The reality is that the digital world has provided investigators with a vastly more sophisticated toolkit for solving crimes than ever before. A generation ago, tracking down who a suspect was communicating with and what they were saying meant having law enforcement agents physically follow them around, or break into their establishments to plant remote listening devices. Today, not only is that information available in a consolidated format, capable of being conveniently delivered to any field office, in many cases it can be traced back for months or even years depending on the policy of the company handling the communications.
The challenges posed by security-conscious criminals are hardly unprecedented. Indeed, people have always had ways of rendering information inaccessible to investigators – including simply burning or burying incriminating material. The fact that information is now encrypted does not represent an unprecedented challenge for law enforcement; it merely represents a slight retreat from the “golden age of surveillance” that we currently live in.
Moreover, even sophisticated encryption systems are not a black hole for criminal data. After the 2015 San Bernardino attack, the FBI tried to compel Apple to build a new operating system that could be installed on a phone belonging to one of the shooters, bypassing its security features. At the time, the Bureau claimed that Apple’s cooperation was the only possible way its investigators could get access to the contents of the phone. After Apple refused, however, the FBI found a researcher to hack into the phone for it. Today, at least one company likely has the ability to compromise any iPhone model on the market. Additionally, American law enforcement agencies recently sought – and received – an update to criminal procedure rules to more easily facilitate their hacking operations (despite having no law that authorizes such activity or provides adequate protections), and are using malware to gain access to information on encrypted devices. In other cases, police are reverting to more traditional investigative methods which, while time consuming, are still effective. Russian spy Anna Chapman encrypted her information, but relied on passwords that she wrote down on paper, which were later found in a police search. To catch Ross William Ulbricht, the alleged head of the illicit Silk Road market, investigators waited until he had logged into his computer before snatching the device away. Such “legal muggings” have become a standard part of the investigative toolkit in the UK. If these techniques are good enough to catch professional spies and international drug traffickers, surely law enforcement’s fears of “going dark” cannot be as dire as claimed?
Focus on the future, instead of trying to recapture the past
While the technical debates about whether encryption can be securely backdoored are interesting, they represent only one part of the argument for why these proposals are a terrible idea. The use of encryption technologies has been widely recognized as a core component of freedom of speech, as well as the right to privacy. While these rights are not absolute, a broad mandate requiring backdoors that would impact anyone other than specific targets of investigations would likely be considered a violation of human rights standards. Moreover, in the United States we legally recognize code as speech, and therefore compelling companies to create and implement new access mechanisms would violate their First Amendment rights. It seems unlikely that any mechanism can – or should – survive scrutiny under these standards.
Fundamentally, we all have an interest in a safe and secure Internet. Even U.S. intelligence agencies have fallen victim to bad security, most notably with the Shadow Brokers data breach. This may be why, despite warning of the dangers of widespread encryption, the U.S. government remains a major financial backer of Tor, largely because of a belief in the national security value of enabling secure communication. Modern technology is fluid and fast moving. To keep up, law enforcement needs to adapt. But seeking to undermine encryption only looks backward instead of focusing on where technology is going. We should be having conversations about new investigative techniques, not trying to preserve the access enjoyed in the days before encryption was so widespread, particularly when so much is at stake.