The encryption quicksand into which Apple is sinking
Encryption and security thinking used to focus solely on protecting a customer's data from cyberthieves and other bad guys trying to break in. It then morphed to also treat law enforcement as the attacker, adding various defenses to keep out municipal, state, federal and global investigators, who are sometimes on fishing expeditions for any hint of wrongdoing.
Companies like Apple—and where Apple goes, the tech industry almost always follows—are now adding a new enemy to their to-be-protected-from lists: themselves. The theory goes that if Apple's best engineers can't break into the company's own devices, government court orders become irrelevant. Apple can't be made to do what it physically cannot do.
But this raises an interesting legal question: What if Apple goes out of its way, spending millions of dollars, to develop a way to not be able to do something? Is that intentional obstruction?
Look at it from a different perspective. Let's say that a well-financed drug dealer knew that law enforcement wanted to look at particular financial records in his possession. What if he built a massive vault that, upon a specific command spoken in its owner's voice, would delete all access codes and disintegrate the vault's contents? (OK, if it could really disintegrate all of its contents, I suppose it wouldn't need to delete its codes. But drug dealers tend to opt for security redundancies.)
Could the act of creating such a vault and speaking that command be considered defiance of the anticipated court order? Would it potentially constitute contempt of court?
We have discussed here why it's not a great idea to have the government dictating corporate encryption policies. But what the government is doing here is a lot more invasive than many people think. They are not asking for an encryption backdoor. They are instead asking Apple to use its engineers—at no cost to the government—to create security weaknesses that the government can exploit. Specifically, they are asking Apple to remove the limit on the number of bad passcode attempts before the device locks up, as well as the forced delays between attempts. With those two protections gone, a brute-force attack will inevitably crack the phone.
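To see why those two protections matter, here is a minimal back-of-the-envelope sketch (mine, not anything from the court filings). It assumes each passcode guess costs roughly 80 milliseconds of hardware-enforced key derivation, a figure Apple has cited in its iOS security documentation; the exact number is an assumption, and the math is illustrative only.

```python
# Rough estimate of worst-case brute-force time once the retry limit and the
# forced delays between attempts are removed. The ~80 ms cost per guess is an
# assumption (Apple's hardware-backed key derivation is tuned to roughly that
# figure); treat the results as illustrative, not forensic.

SECONDS_PER_GUESS = 0.08  # assumed hardware cost of testing one passcode


def worst_case_hours(passcode_length: int, alphabet_size: int = 10) -> float:
    """Hours needed to try every possible passcode of the given length."""
    combinations = alphabet_size ** passcode_length
    return combinations * SECONDS_PER_GUESS / 3600


for length in (4, 6):
    print(f"{length}-digit passcode: ~{worst_case_hours(length):.1f} hours worst case")
# 4 digits: roughly 13 minutes; 6 digits: under a day. With the 10-attempt wipe
# and escalating delays left in place, neither attack is practical.
```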
Mark Rasch is a former federal prosecutor who headed the U.S. Justice Department's high-tech crimes unit and now serves as chief security evangelist for Verizon. Rasch argues that Apple—and anyone who follows it—is in the clear legally.
Here's the legal bottom line, from Rasch's perspective, which is not necessarily in accordance with the legal interpretations of current Justice Department lawyers or lawyers for the next administration's Justice Department.
Rasch's position: A court-ordered search warrant applies only to law enforcement. A warrant to search a suspect's home allows law enforcement to search but does not require the homeowner to cooperate. The homeowner, Rasch argues, is fully within his rights to say, "Thanks, but I choose not to let you into my house." Law enforcement then has the right to smash the door in, but the warrant doesn't obligate the homeowner to do anything. A court order compelling a civilian to assist would have to be issued under the All Writs Act, and the question then becomes whether the court has that power.
"Not cooperating and obstruction are different things. Obstruction in advance doesn't exist," Rasch said. "Nor does 'aiding and abetting a crime by not making it easy for me to solve crime.'" The closest Apple analogy would be if police had a valid search warrant and were trying to break into a suspect's home, Rasch said. The suspect in this scenario had so expertly reinforced all of the doors, windows and walls that police equipment—including battering rams—were ineffective.
What if police turned to a construction worker walking down the street and said, "We want you to spend as many weeks as it takes to figure out a way to break into this house"? Even if the police offered to pay the worker a fair rate, is he obligated to comply? What if he doesn't want to spend weeks doing this? "The government wants to take the labor of Apple engineers without just compensation," Rasch said. And even getting just compensation—which is irrelevant in this case because "Apple doesn't want to get paid to do this"—is tricky.
"Apple doesn't want to take invading a customer's privacy and to turn it into a profit center," Rasch said. "And the government doesn't want to establish the precedent of having to pay people to comply with a court order." In short, as long as Apple doesn't do this to circumvent the search for a specific criminal act that they know about in advance (and no one has suggested that they are), crafting a way to lock themselves out permanently is legally sound.
Alas, it's not that simple. Consumers are maddeningly self-contradictory. They love the idea of Apple not having access, so that Apple cannot violate their privacy even with a search warrant. But they hate the idea of Apple not having access if they forget their password and can't simply reset it. They want access to all of those photos and messages and videos no matter what, and they expect Apple to be able to deliver it.
The typical response to forgotten passwords is to fall back on biometric authentication. In theory, consumers can't "forget" their retinas or their fingerprints. But that isn't a perfect solution either. What if the phone suffers some corruption and can no longer match the consumer's biometric data against the stored file? Again, consumers expect Apple to be able to swoop in and help.
That is the real problem. If Apple creates the perfect defense against itself, it can't comply with urgent requests from the government or its own customers.
Don't forget that in this specific case, the government wants to break into the iPhone of Syed Rizwan Farook, one of the killers in the San Bernardino, Calif., shooting rampage. And the phone was not his own. It was owned by the county agency he worked for, and the county government—the true owner of the phone—has consented to its being searched.
That means that there is no privacy issue in this immediate case, although it's a certainty that privacy will crop up in other cases.
Rasch points out that had the county's IT staff been using a mobile device management (MDM) product on county phones, this entire issue would have been avoided, because the county would have had the employee's password. The question then comes back to Apple—and other technology players—and how far they are willing to go to thwart government inquiries (to protect customers) when such efforts will also block those same customers if they accidentally get locked out of their phones.
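For illustration only, here is a minimal sketch of the escrow idea behind an MDM deployment. Every class and method name here is hypothetical (this is not any real MDM vendor's API); the point is simply that the organization, not just the employee, keeps a way back into its own device.

```python
# Hypothetical sketch of an MDM-style escrow flow. All names are invented for
# illustration and do not correspond to any real MDM product's API.

from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class ManagedDevice:
    serial: str
    escrowed_unlock_token: Optional[str] = None  # held by the organization, not the user


class ToyMDMServer:
    """Toy model: the organization enrolls devices and escrows unlock material."""

    def __init__(self) -> None:
        self._devices: Dict[str, ManagedDevice] = {}

    def enroll(self, serial: str, unlock_token: str) -> None:
        # At enrollment, the device hands the server a token that can later be
        # used to clear or reset a forgotten (or withheld) passcode.
        self._devices[serial] = ManagedDevice(serial, unlock_token)

    def can_clear_passcode(self, serial: str) -> bool:
        # If the token was escrowed, the owner can get back into its own
        # device without asking the manufacturer, or a court, for help.
        device = self._devices.get(serial)
        return device is not None and device.escrowed_unlock_token is not None


server = ToyMDMServer()
server.enroll("COUNTY-PHONE-001", unlock_token="example-escrow-token")
print(server.can_clear_passcode("COUNTY-PHONE-001"))  # True: the owner keeps access
```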
It's akin to CIOs who want no one to be able to get into the enterprise network without proper credentials, unless those credentials are lost, in which case they want their vendors to be able to override the controls and get in.