A former Chief Technical Officer at Microsoft has proposed a solution to the FBI's supposed “going dark” problem — the use of secure encryption tools by criminals and terrorists. His solution is to provide a cryptographic back door that would be locked with a private key that the phone manufacturer would maintain and guard with the same fanatically obsessive security with which Apple or Microsoft guards its own update-signing keys.
“Cracking the Crypto War”
Steven Levy, Wired, April 25, 2018
Say the FBI needs the contents of an iPhone. First the feds have to actually get the device and the proper court authorization to access the information it contains — Ozzie's system does not allow the authorities to remotely snatch information. With the phone in its possession, they could then access, through the lock screen, the encrypted PIN and send it to Apple. Armed with that information, Apple would send highly trusted employees into the vault where they could use the private key to unlock the PIN. Apple could then send that no-longer-secret PIN back to the government, which can use it to unlock the device.
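The flow Levy describes is, at bottom, ordinary public-key escrow: the device encrypts its PIN under a manufacturer public key, and only the matching private key, held offline in the vault, can recover it. A minimal sketch using textbook RSA with toy parameters — insecure, for illustration only, and not Ozzie's actual “Clear” design:

```python
# Toy illustration of the escrow flow described above -- NOT Ozzie's
# actual "Clear" design. Textbook RSA with tiny primes; a real system
# would use ~2048-bit keys, padding, and hardware-protected storage.

# Vault side: the manufacturer generates a keypair once; the private
# exponent d never leaves the vault.
p, q = 104723, 104729            # toy primes, far too small for real use
n, phi = p * q, (p - 1) * (q - 1)
e = 65537                        # public exponent, burned into every phone
d = pow(e, -1, phi)              # private exponent, guarded like a signing key

# Phone side: at setup, the device escrows its PIN under the public key.
pin = 438902
escrowed = pow(pin, e, n)        # stored on the device; useless without d

# After a court order, a trusted employee applies d inside the vault.
recovered = pow(escrowed, d, n)
assert recovered == pin
```

Note that every step the critics below attack — authenticating the request, moving the escrowed value to Apple, and applying `d` — lives outside this math, in people and procedure.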
Robert Graham points out several vulnerabilities in this scheme:
“No, Ray Ozzie Hasn't Solved Crypto Backdoors”
Robert Graham, Errata Security, April 25, 2018
The vault doesn't scale
… The more people and the more often you have to touch the vault, the less secure it becomes. We are talking thousands of requests per day from 100,000 different law enforcement agencies around the world. We are unlikely to protect this against incompetence and mistakes. We are definitely unable to secure this against deliberate attack. …
Cryptography is about people more than math
… How do we know the law enforcement person is who they say they are? How do we know the “trusted Apple employee” can't be bribed? How can the law enforcement agent communicate securely with the Apple employee?
You think these things are theoretical, but they aren't. …
Locked phones aren't the problem
Phones are general purpose computers. That means anybody can install an encryption app on the phone regardless of whatever other security the phone might provide. The police are powerless to stop this. Even if they make such encryption [a] crime, criminals will still use encryption.
That leads to a strange situation in which the only data the FBI will be able to decrypt is that of people who believe they are innocent. Those who know they are guilty will install encryption apps like Signal that have no backdoors.
In the past this was rare, as people found learning new apps a barrier. These days, apps like Signal are so easy even drug dealers can figure out how to use them. …
The FBI isn't necessarily the problem
… Technology is borderless. A solution in the United States that allows “legitimate” law enforcement requests will inevitably be used by repressive states for what we believe would be “illegitimate” law enforcement requests.
Ozzie sees himself as the hero helping law enforcement protect 300 million American citizens. He doesn't see himself [as] what he really is, the villain helping oppress 1.4 billion Chinese, 144 million Russians, and another couple billion living [under] oppressive governments around the world.
Ozzie pretends the problem is political, that he's created a solution that appeases both sides. He hasn't. He's solved the problem we already know how to solve. He's ignored all the problems we struggle with, the problems that we claim make secure backdoors essentially impossible. I've listed some in this post, but there are many more. Any famous person can create a solution that convinces fawning editors at Wired Magazine, but if Ozzie wants to move forward he's going to have to work harder to appease doubting cryptographers.
Matthew Green makes a similar case, even more persuasively (because his rhetoric is less heated until he reaches the climax of his argument).
“A Few Thoughts on Ray Ozzie's ‘Clear’ Proposal”
Matthew Green, A Few Thoughts on Cryptographic Engineering, April 26, 2018
The richest and most sophisticated phone manufacturer in the entire world tried to build a processor that achieved goals similar to those Ozzie requires. And as of April 2018, after five years of trying, they have been unable to achieve this goal — a goal that is critical to the security of the Ozzie proposal as I understand it. …
The reason so few of us are willing to bet on massive-scale key escrow systems is that we've thought about it and we don't think it will work. We've looked at the threat model, the usage model, and the quality of hardware and software that exists today. Our informed opinion is that there's no detection system for key theft, there's no renewability system, HSMs [Hardware Security Modules] are terrifically vulnerable (and the companies are largely staffed with ex-intelligence employees), and insiders can be suborned. We're not going to put the data of a few billion people on the line [in] an environment where we believe with high probability that the system will fail.
There's a pretty good consensus among security and legal researchers with a variety of political perspectives. Here are some more examples:
“Ray Ozzie's Proposal: Not a Step Forward”
Steven M. Bellovin, SMBlog, April 25, 2018
“Building on Sand Isn't Stable: Correcting a Misunderstanding of the National Academies Report on Encryption”
Susan Landau, Lawfare, April 25, 2018
An exceptional-access system is not merely a complex mathematical design for a cryptosystem; it is a systems design for a complex engineering task. … An exceptional-access system would have to operate in real time, authenticate multiple law-enforcement agencies … ensure the accuracy of the authentication system and its ability to withstand attacks, and handle frequent updates to hardware, the operating system, phones, and more. The exceptional-access system would have to be flexible enough to handle the varied architectures of different types of phones, security systems and update processes. …
The fundamental difference between building a sound cryptosystem and a secure exceptional-access system is the difference between solving a hard mathematics problem … and producing a sound engineering solution to a difficult systems problem with constantly changing parts and highly active adversaries.
“Ray Ozzie's Key-Escrow Proposal Does Not Solve the Encryption Debate — It Makes It Worse”
Riana Pfefferkorn, Center for Internet and Society, April 26, 2018
The access mechanism Ozzie proposes would act as a kind of tamper-evident seal that, once “broken” by law enforcement, renders the phone unusable. Ozzie touts this as a security feature for the user, not a bug. …
This “feature” alone should consign Ozzie's idea to the rubbish heap of history. … Ozzie's scheme would basically require a self-destruct function in every smartphone sold, anywhere his proposal became law, that would be invoked thousands and thousands of times per year, with no compensation to the owners. That proposal does not deserve to be taken seriously — not in a democracy, anyway. …
It would give the police a way to lean on you to open [your] phone for them. “You can make it easy on yourself by unlocking the phone and giving it to us,” they might say. “But hey, we don't need you. We can go to a judge and get a warrant, and then we can just have Apple unlock it for us. … That would brick it forever, so you couldn't use your phone anymore, even after we gave it back to you eventually. You'd have to go out and buy a new one.”
The United States Department of Justice is continuing its doomed quest for an encryption system that simultaneously conceals texts from some people who should not have access to them and reveals them to other people who should not have access to them. They have begun to organize research teams and conferences to discuss ways of forcing or tricking people who want strong encryption into accepting weak encryption instead.
The new feature of this story is that some of the researchers who have gone over to the dark side are now identified by name: Ray Ozzie, formerly Chief Technical Officer and Chief Software Architect for (of course) Microsoft Corporation; Stefan Savage, Irwin and Joan Jacobs Chair in Information and Computer Science at the University of California, San Diego; and Ernie Brickell, Chief Security Architect, Intel Corporation.
The presence of Brickell and Ozzie guarantees that users should never trust encryption systems supplied in Intel hardware or as part of the Windows operating system, but should continue to use systems, such as GPG, that are entirely implemented in open-source software.
“Justice Dept. Revives Push to Mandate a Way to Unlock Phones”
Charlie Savage, The New York Times, March 25, 2018
A research team at Boston University has discovered a technique for partially encrypting messages so as to make decryption extremely expensive, but not impossible. They call their partial-encryption system “cryptographic crumpling.” The computation required to decrypt a message prepared by this technique is useless in the decryption of any other message, so there are no economies of scale — the decryptor must pay the high computational price all over again for each new message.
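The “no economies of scale” property can be illustrated with a stand-in construction (this is not the Boston University team's actual scheme): give each message an independent random key, then publish that key with its low bits withheld, so that recovering any one message costs a fresh brute-force search.

```python
import hashlib
import secrets

# Illustrative stand-in for "crumpling" -- not the Boston University
# construction itself. Each message gets a fresh random 64-bit key;
# the sender withholds the key's low `work_bits` bits, so recovering
# any single message costs ~2^work_bits hash evaluations, and that
# effort is useless against every other message.

def crumple(message: bytes, work_bits: int = 20):
    assert len(message) <= 32          # one SHA-256 pad keeps the toy short
    key = secrets.randbits(64).to_bytes(8, "big")
    pad = hashlib.sha256(key).digest()[: len(message)]
    ciphertext = bytes(m ^ p for m, p in zip(message, pad))
    hint = int.from_bytes(key, "big") >> work_bits     # high bits disclosed
    check = hashlib.sha256(b"chk" + key).digest()[:8]  # confirms a guess
    return ciphertext, hint, check

def crumple_open(ciphertext, hint, check, work_bits: int = 20):
    # Pay the full computational price for THIS message: search the
    # withheld bits until the check value confirms the candidate key.
    for low in range(1 << work_bits):
        key = ((hint << work_bits) | low).to_bytes(8, "big")
        if hashlib.sha256(b"chk" + key).digest()[:8] == check:
            pad = hashlib.sha256(key).digest()[: len(ciphertext)]
            return bytes(c ^ p for c, p in zip(ciphertext, pad))
    return None
```

With `work_bits` pushed high enough, each decryption would cost serious money in computation; a small value here just keeps the demo fast. The key point survives even in the toy: nothing learned while opening one message shortens the search for the next.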
The researchers offer their system as a way of resolving the “second crypto wars” between government officials, who insist that the makers of all commercial-grade encryption software must provide back doors for law-enforcement and national-security agencies, and privacy advocates, who insist that only strong, end-to-end encryption will protect their rights. The researchers argue that their system would allow well-funded government agencies to access the partially encrypted data in exceptional cases, but would force those agencies to choose their targets so carefully that the privacy rights of ordinary users would not be significantly affected.
But this proposal doesn't really accommodate either side. Government officials say they need to decrypt messages that could contain evidence of crime or terrorism regardless of how many such messages there are, and so would not be content with a system in which their budget constrains their ability to decrypt. And privacy advocates would surely note that if government officials with legitimate interests in the contents of communications were able to perform the decryptions, so too would corporations bent on industrial espionage, hostile foreign governments, and even well-funded hacking teams. A back door works equally well for everyone who has the resources to open it.
Such failed attempts at compromise reinforce the conclusion (which most security analysts reached long ago) that the requirements of government officials and privacy advocates are incompatible.
“Cryptographic Crumpling: The Encryption ‘Middle Ground’ for Government Surveillance”
Charlie Osborne, Zero Day, March 19, 2018