How FBI vs. Apple could cripple corporate and government security
As the discussion focuses on privacy and crime, what is mostly lost is an analysis of the potential business and government implications: not merely the impact on Apple, technology vendors, and law enforcement agencies, but the effects on the wider business community and the daily operations of thousands of agencies at all levels of government. Taken from that point of view, the President’s statement could become, “… it’s fetishizing the investigation of a limited set of highly serious crimes above every other value.”
Day to day I work as an IT security industry analyst. Formerly a research vice president at Gartner, where I was the lead analyst for datacenter encryption, I now run my own firm. For the past 15 years, I have advised some of the largest companies and government agencies in the world on using encryption systems. I’ve written multiple research papers, and I continue to work with most of the major encryption technology vendors.
Knowing how encryption is used throughout the business world, I can see clearly that one of our most fundamental security tools is at the center of a civil rights debate, and that the slightest misstep could set back corporate and government security by decades.
Encryption is ubiquitous in the digital world. We use it for every credit card transaction, every time we unlock a car with a key fob, every time we log into nearly anything with a password, visit a secure website, connect to a wireless network, update software, or do pretty much anything with a bank. Society relies on encryption for far more than merely protecting our phones and online chats.
Encryption is merely math, not sorcery. It is a heavily studied field of math with an extensive body of work in the public domain. The U.S. government once restricted the export of strong encryption products, forcing companies to use weaker versions overseas and support that weaker encryption here at home, since the Internet doesn’t respect national boundaries. It’s a decision we still pay the price for daily: earlier this year, researchers discovered yet another vulnerability, affecting about a third of the Internet, that traces directly to this deliberate weakening back in the 1990s.
The fight was known as the Crypto Wars, and the government, under President Clinton, eventually relented. Those attempts at control did little more than weaken the security of products and businesses. An encryption algorithm isn’t a nuclear centrifuge, and when all you needed to do was print source code for software in a book and ship it overseas for someone to scan into a computer and compile, the idea of restricting a bit of math to a national border became farcical. Especially when that math was already legal and public.
The U.S. government backed down on the battle for encryption because it was essential to running businesses and government services over the Internet. Attempts to allow encryption outside the country only in a weakened state left everyone vulnerable to attack since domestic systems also needed to support the lower security levels. The remnants of those early attempts are still having repercussions decades later.
Even without restrictions on encryption, the proper implementation is difficult. When I authored a paper on defending enterprise data on iOS 7, I had to describe how to best work around Apple’s incomplete encryption—the very holes that started this debate, and were later closed in iOS 8.
The Department of Justice, in their latest brief, states, “This burden, which is not unreasonable, is the direct result of Apple’s deliberate marketing decision to engineer its products so that the government cannot search them, even with a warrant.” That statement is an outright falsehood disguised as wishful thinking. Improving the encryption of iOS 8 was a security decision, one lauded by IT security departments everywhere, who had long been encrypting laptops to an equal standard.
In his South by Southwest speech, President Obama stated, “I suspect the answer will come down to how we create a system where the encryption is as strong as possible, the key is as secure as possible, it’s accessible by the smallest number of people possible, for the subset of issues that we agree is important.”
There are existing techniques to enable third-party access to strongly encrypted systems. One widely used method relies on an alternate key that can decrypt the data. Businesses often support more than one key for a piece of data or a computer for various reasons, such as ensuring the IT department can still recover a corporate system if an employee tries to lock them out.
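To make that concrete, here is a minimal sketch of the pattern, often called envelope encryption, assuming Python and its third-party cryptography library: the data is encrypted once with a random data key, and that data key is then wrapped separately for each party allowed to recover it. The key names and flow are illustrative, not any vendor’s actual design.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

# One key encrypts the data; separate wrapping keys belong to each party
# allowed to recover it (the names here are hypothetical).
data_key = AESGCM.generate_key(bit_length=256)
employee_key = AESGCM.generate_key(bit_length=256)   # in practice, derived from the user's passphrase
recovery_key = AESGCM.generate_key(bit_length=256)   # held by the IT department

# Encrypt the document once with the data key.
nonce = os.urandom(12)
ciphertext = AESGCM(data_key).encrypt(nonce, b"quarterly financials", None)

# Wrap the same data key separately for each authorized holder.
wrapped_for_employee = aes_key_wrap(employee_key, data_key)
wrapped_for_recovery = aes_key_wrap(recovery_key, data_key)

# Either wrapped copy is enough to recover the data later.
recovered_key = aes_key_unwrap(recovery_key, wrapped_for_recovery)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"quarterly financials"

Note that adding a recovery key never requires weakening the underlying cipher; it only multiplies the number of keys that must be protected.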
Apple and other technology providers could use this well-known method to allow government access to systems. The truth is this can be done relatively securely. We know how to keep incredibly sensitive encryption keys secure: it typically involves multiple people holding only fragments of the total key, extensive physical security, and non-networked systems. Setting aside the international privacy considerations, and the impact on these technology providers’ international business operations, if such a system were created and used only in rare circumstances, it would be highly unlikely to be broken.
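To give a feel for how those key fragments work, the sketch below splits a key with a simple XOR scheme, so that the key does not exist anywhere until every custodian contributes a share; real deployments typically use threshold schemes such as Shamir’s Secret Sharing so that, say, any three of five card holders suffice. The function names are my own, for illustration only.

import os
from functools import reduce

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, custodians: int) -> list[bytes]:
    # Every custodian gets one share; ALL shares are needed to rebuild the key.
    shares = [os.urandom(len(key)) for _ in range(custodians - 1)]
    last_share = reduce(_xor, shares, key)   # key XORed with every random share
    return shares + [last_share]

def combine_shares(shares: list[bytes]) -> bytes:
    # XOR all shares back together to recover the original key.
    return reduce(_xor, shares)

master_key = os.urandom(32)
shares = split_key(master_key, custodians=3)
assert combine_shares(shares) == master_key       # all three custodians present
assert combine_shares(shares[:2]) != master_key   # an incomplete subset reveals nothing about the key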
The problem is that this kind of system is impossible to scale. First, if the FBI truly wants to eliminate warrant-proof (properly encrypted) storage and communications, it would need the key for every encrypted product and service on the Internet. That would require highly secure mechanisms for every software developer and hardware manufacturer to provide their keys. Since that is completely unworkable, perhaps only major manufacturers and developers over a certain size would have to participate.
Then there’s the issue of access. Does only the FBI get to use the system for terrorism cases? Do local law enforcement officers get access to catch child predators? Drug dealers? Could this be limited only to the U.S.? Or would other countries, including ones like China that the U.S. government itself publicly accuses of hacking corporate systems, also gain access or require their own alternate keys? These are legitimate and complex questions, not mere aggrandized slippery slope arguments. The more access there is to a key, and the more often it is used, the less secure it becomes by definition.
Ignoring the privacy concerns, the impact on business and government systems (and thus operations) could become crippling.
When I advise companies on properly encrypting laptops, aside from the complexities of key management, I have to guide them through all the potential weaknesses. For example, I tell them that if they cross certain international borders, or keep highly sensitive information on a Mac they might lose physical control of, they should always shut the system down rather than put it to sleep, because the encryption keys remain in memory while the Mac is merely asleep, leaving it vulnerable.
This isn’t paranoia. We know for a fact that certain governments hack corporations (and other governments), and a stolen laptop can be a great source of information. The same is true for industrial espionage (it’s real) or targeted criminal attacks. Corporations spend many millions of dollars to secure mobile computers using enterprise encryption software, and millions more on managing secure phones and tablets.
If the FBI mandates alternate decryption keys for all devices, those keys would potentially need to be generated for all corporate systems, not just consumer phones. If such a law didn’t apply to laptops, that would be an easy way to skirt the requirement. If it did, the government would gain direct access to all those systems; complex key-exchange mechanisms would need to be created, and every business or government agency that encrypts would have to provide recovery keys.
Then how would companies handle international operations? Or international companies with workers in the U.S.? This is before we even get into the issue of other nations requiring their own access keys. One outcome could be that internationally encrypted devices are inaccessible to the U.S., and U.S. systems are safe in other countries, unless the governments cooperate in major cases and exchange evidence, which isn’t unprecedented.
If the scope is limited to just phones, and only in the U.S., and only for terrorism and a few other cases, the risk and burden to U.S. companies would possibly be manageable. But based on the stated objectives of the FBI and President Obama, it is reasonable to assume the scope is wider, and it is hard to imagine that only the U.S. would mandate a golden key, and only for phones. Even without some malicious hackers stealing the keys, the end result is corporate devices, especially those used with international travel, could no longer be considered secure in many real-world situations.
In previous statements, FBI Director James Comey has also expressed concern about encrypted communications, such as iMessage, where the government can’t access the key. Businesses depend on secure communications on multiple levels, ranging from employee communications to secure transactions with partners and services.
With some of these systems, the government can mandate backdoor access, forcing a provider like Apple or Facebook to keep records of communications, or at least to retain the ability to sniff communications when required.
But not all these systems are centralized. Enterprises commonly set up their own hosted communications systems, either because they don’t trust an external service provider or for regulatory reasons. If a tool like iMessage requires access, what about VPNs? Secure connections to websites and email servers? Secure messaging systems? Secure file transfer systems? Financial transaction systems that run over the Internet?
All of these rely on the exact same set of foundational technologies, and all are abused by criminals every day. Worrying they may be within regulatory scope isn’t much of a mental stretch.
There are thousands of systems and technologies out there, and few lines between those used by businesses and the general public. If the bad guys switch from the providers known to work with the government to the open source and commercial technologies used by business, those systems will likely also have to support government access. That means backdoors and recovery keys, since there isn’t any known alternative.
This brings us back to the same problems we have with devices. We simply don’t have scalable mechanisms to support lawful access without reducing security. There is a very real risk that secure communications on multiple levels could be deeply compromised and result in real criminal losses. And that’s before we start worrying about foreign governments.
The strongest encryption in the corporate world isn’t found in phones, but in data centers. Enterprises commonly use specialized security appliances designed as unbreakable safes for encryption keys and operations. These Hardware Security Modules, or HSMs, secure banks, retailers, and even your iCloud Keychain backups. Access requires smart cards (sometimes multiple cards held by different employees), and physical tampering can trigger failsafe deletion of all the stored keys.
If you don’t want to buy an HSM, you can rent one from any of several major cloud providers. They aren’t cheap, but they provide the ultimate in security, since not even the cloud provider can access your data.
That’s merely one example of the strong encryption tools absolutely essential for secure data centers and applications. This equipment and these tools aren’t the kinds of things you can pick up at Best Buy, but they are certainly within the budgets of terrorists and a range of criminals. They are more secure than iPhones and can easily be used to build storage and communications systems. We use them for encrypted financial and medical databases, secure file storage, or even to keep those little CVV codes on the back of your credit card safe.
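The design principle is easier to see in code: the application never touches the key, it only asks the module to perform encryption or decryption, so even a fully compromised application server cannot walk away with the key material. The class below is a plain-software stand-in for that boundary, with hypothetical names; real HSMs expose standard interfaces such as PKCS#11 and enforce the boundary in tamper-resistant hardware.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

class ToyHSM:
    # Conceptual stand-in for an HSM: the key exists only inside the module.
    def __init__(self) -> None:
        self.__key = AESGCM.generate_key(bit_length=256)  # never leaves the module

    def encrypt(self, plaintext: bytes) -> bytes:
        nonce = os.urandom(12)
        return nonce + AESGCM(self.__key).encrypt(nonce, plaintext, None)

    def decrypt(self, blob: bytes) -> bytes:
        nonce, ciphertext = blob[:12], blob[12:]
        return AESGCM(self.__key).decrypt(nonce, ciphertext, None)

# The application stores only ciphertext; stealing its database, or even the
# whole server, yields nothing without the module itself.
hsm = ToyHSM()
stored_cvv = hsm.encrypt(b"123")
assert hsm.decrypt(stored_cvv) == b"123"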
If these tools remain legal for enterprise use, the odds are they will be used by nefarious groups to avoid government monitoring of consumer tech. If businesses are required to add backdoors and golden keys too, we once again undermine the foundation of digital security.
The President and the director of the FBI have portrayed this conflict as one between privacy absolutists and government compromise. The issue is that the technology itself forces us to make a binary decision. There are no known techniques for providing lawful access to encrypted communications and storage at scale. The only way to allow government access is to reduce the security of foundational technologies used by business and government agencies, not merely individual citizens. That is math, not politics.
Further complicating the situation is that security constantly evolves, and we continue to adopt ever stronger technologies in more situations simply to stop attackers, from criminals to hostile governments. These aren’t outlandish movie scenarios; they are the painful, expensive reality for every business in the world. The only difference between consumer, corporate, and government technologies is the price tag. Restrictions on these improvements could be catastrophic.
Last July, a group of extremely well-respected cryptographers published an excellent overview of the feasibility and security impact of government access, concluding that mandating exceptional access would pose grave security risks.
Everything in my experience supports their findings. I can’t think of any way to allow government access for criminal and national security cases that wouldn’t undermine the foundations of digital security across the board. Even ignoring the massive complexities of instituting these requirements globally, unless the government required access to every possible encryption technology, it would be trivial for criminals and terrorists to hide, while the risks to nearly all businesses and government agencies would rise dramatically.