The question of precisely what the law permits the government to do is a different question from whether we think it is a good thing that the law permits the government to do what it does. Our intuitions and considered opinions may diverge on these two questions.
Keeping that distinction in mind, let’s look at the order issued by a magistrate judge in Riverside, Calif., this month, which required Apple to provide technical assistance to the FBI in unlocking the phone of one of the San Bernardino terrorists. The government has an iPhone 5, which it legally seized pursuant to a valid search warrant. The phone belonged to one of the terrorists, and the law allows the government to search that phone for evidence related to crimes.
There is no dispute about the government’s authority to search that phone, as thoroughly as it wants to.
But the government cannot search the phone, because the phone’s security is better than the government’s security-breaking tech. The phone has a feature, familiar to many Apple users, that wipes the phone of all data if the passcode is entered incorrectly 10 times. The government wishes to get past the passcode by using the “brute force” method: connecting the phone to a computer that will automatically enter thousands of passcodes until it hits the right one. But the phone’s auto-wipe feature will defeat that method.
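To see why the auto-wipe feature matters, here is a minimal sketch (purely illustrative, not Apple’s actual implementation) of a brute-force search over 4-digit passcodes. Without a try limit, all 10,000 combinations can be exhausted in moments; with a hypothetical 10-attempt cap mirroring the wipe setting, the search almost certainly fails first.

```python
# Illustrative sketch: why a 10-try auto-wipe defeats brute-forcing a passcode.
# The limit and passcode format are assumptions for the example.

MAX_ATTEMPTS = 10  # hypothetical cap, mirroring the iPhone's wipe-after-10 setting

def brute_force(check_passcode, max_attempts=None):
    """Try every 4-digit passcode in order; stop if a try limit applies."""
    for attempt, guess in enumerate(f"{n:04d}" for n in range(10000)):
        if max_attempts is not None and attempt >= max_attempts:
            return None  # the device would have wiped itself by now
        if check_passcode(guess):
            return guess
    return None

secret = "7319"
oracle = lambda guess: guess == secret

print(brute_force(oracle))                # finds "7319" when every guess is allowed
print(brute_force(oracle, MAX_ATTEMPTS))  # None: the wipe limit stops the search
```

The asymmetry is the whole point: the passcode space is tiny, so the only thing protecting the data is the cap on attempts.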
Now, the government is perfectly free, under existing law, and under all of our intuitions about what is politically and socially appropriate, to send that phone to some secret lab buried in the Utah desert where a team of young, geeky, yet somehow telegenic computer techs will place it on an elevated platform in the center of a high-tech lab and analyze it from all angles while making witty small talk about each other’s romantic lives.
But the government has conceded that its team of techs can’t solve the puzzle. So the government wants to force Apple to use its own team of young, geeky, yet somehow telegenic computer techs to break into the phone.
And the government wants Apple to do so via a very particular means. It wants Apple to write code that will update the phone’s operating system while the phone is locked in a way that will disable the auto-wipe function. The phone will only do that if it recognizes an Apple encryption key that is “fused into the phone itself during manufacture,” according to the FBI agent’s declaration filed with the government’s motion.
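That fused-key requirement is why only Apple can be the target of the order. A conceptual sketch (hypothetical names and keys throughout, and HMAC standing in for the public-key signatures real devices use) of a device that accepts an update only when it carries a valid signature from the manufacturer’s key:

```python
# Conceptual sketch, not Apple's code: a device accepts a system update only
# if it verifies against a key "fused into the phone itself during manufacture."
# HMAC is a simplifying stand-in for real code-signing; the key is invented.
import hashlib
import hmac

MANUFACTURER_KEY = b"fused-at-manufacture"  # hypothetical stand-in for the signing key

def sign_update(update_blob: bytes, key: bytes) -> bytes:
    """Produce a signature for an update image under the given key."""
    return hmac.new(key, update_blob, hashlib.sha256).digest()

def device_accepts(update_blob: bytes, signature: bytes) -> bool:
    """The device recomputes the signature with its fused key and compares."""
    expected = hmac.new(MANUFACTURER_KEY, update_blob, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

update = b"patch: disable auto-wipe"
print(device_accepts(update, sign_update(update, MANUFACTURER_KEY)))  # True
print(device_accepts(update, sign_update(update, b"anyone-else")))    # False
```

Because no one else holds the fused key’s counterpart, no one else can produce an update the locked phone will install; hence the demand that Apple itself write and sign the code.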
But Apple does not have any such system-updating, auto-wipe-disabling code sitting around ready to use. And there’s a good reason why not: such code would eviscerate the very security which is at the heart of Apple’s product and corporate identity.
Apple does not want to do what the government requests. So here are our two questions: First, can the government compel Apple to do so under existing law? Second, would it be a good or bad thing if the law permitted the government to compel Apple to do so?
I have carefully examined the case law cited by the government in its filings. (For those interested in the procedure at work here, the government filed its application for the order ex parte, meaning Apple had no chance to respond. The court issued the order, but gave Apple five days to file an opposition. Before Apple had the chance to do so, the government jumped back in with a motion to compel Apple to comply with the original order. I would consider that a bit bullying if the opposing side wasn’t the richest company in the world, and hadn’t just hired Ted Olson as its lawyer.)
The 1789 All Writs Act
The government’s argument is that the All Writs Act, from 1789, allows courts to compel third parties to assist in the execution of valid searches. The government has cited eight such cases (and because this is such an important issue for the FBI and the Justice Department, you better believe they looked hard). Of those eight, only two involve gaining access to locked cell phones. One of those, from 2014, does not identify the type of phone, or the type of assistance that the government requires. The published order simply states that the government seeks assistance in “bypassing the lock screen.”
The other one, from 2013, orders Apple to assist the government in obtaining data from an iPhone 5. Interestingly, the 2013 order (the case is called Navarro) specifies that if the data Apple recovers from the phone is encrypted, Apple has no obligation to “enable law enforcement’s attempts to access any encrypted data.”
I find this interesting, because it suggests at a minimum that there are limits under the All Writs Act to the extent to which the government can compel third parties to help it obtain access to something that it has a valid warrant to get. Remember, in all of these cases it is undisputed that the government has a valid warrant for whatever data is stored on the phone. And if that data is encrypted, it would certainly be advantageous to the government to get an order compelling Apple to decrypt it. If the government thought it was entitled to an order compelling Apple to decrypt the data, it would have asked for it.
And if the All Writs Act were really a blanket license for the government to obtain court orders compelling a third party to do whatever it needs to help effectuate a search warrant, then I see no reason why it should not also allow the government to obtain an order compelling a third party to help it decrypt encrypted data. So if that’s a line that the All Writs Act case law has not crossed, we need to ask why.
One possible doctrinal explanation (that is, an explanation of how the existing legal precedent justifies a result) is that the All Writs Act case law contemplates a balancing analysis in which the government’s need for the third party’s cooperation is balanced against the burden on the third party. The Supreme Court created that balancing test in 1977 in the seminal modern All Writs Act case, United States v. New York Telephone Co., 434 U.S. 159 (1977).
The Navarro court may have thought that compelling Apple to break its own encryption was too much of a burden on the company. If so, that burden must be something other than the technical difficulty of the process.
In the San Bernardino case, the government argues in its motion that there is no unreasonable burden on Apple because Apple could create the required code with little time and effort, and that the government would reimburse Apple for its costs.
When Apple files its opposition, I expect (based on Tim Cook’s February 16 customer letter) that Apple will argue that the government is missing the point: the relevant burden is not the time it will take Apple to break its own security. Rather, it’s the act of breaking the security itself, and the effect that doing so will have on millions of other iPhone users, and thus on Apple itself.
So if the burden on Apple derives from the fact that the company’s corporate and product identity is largely tied to the strength and security of its encryption and other privacy protections, is that a burden that should matter in the analysis of whether it should be compelled to help the government search a phone? This question is important because it has never been addressed or resolved, whether by the courts or (as it needs to be) by Congress. So we should be suspicious of statements by government officials that the law obviously compels Apple to comply here.
Would Apple’s New Code Remain a One-Time Exercise?
If the burden on Apple is conceptualized as I suggested above, and as I expect Ted Olson to articulate much more eloquently over the coming weeks, then the All Writs Act precedents don’t do much work for the government, because those cases didn’t present this unique type of burden. As noted, the government wants Apple to write new code (including the company’s own uniquely encrypted signature) that will allow the phone’s operating system to be updated remotely while the phone is locked, disable its security features, and open it up to a “brute force” password-guessing attack. Once such code exists, it’s like the Indominus Rex: no one can credibly argue that it will stay safely in its enclosure.
The government makes light of these concerns, dismissing them as “marketing strategies.”
Maybe the government is right. But I think it’s vital that people understand that the government’s not obviously right: The All Writs Act, as interpreted by the courts, does require a balancing analysis including third-party burdens, and Apple is claiming that forced compliance will undermine its corporate identity, its—yes—marketing strategies, and the data security of millions of its customers. If the government’s answer is “So what?”, then I think it’s the government’s burden to show why the law shouldn’t take such burdens into consideration. And it has not done that in its briefing.
Now, the more general question: Legal precedent aside, is this the sort of thing we think the government ought to be able to do? In other words, if a private company creates a private secure storage medium that the company itself cannot access, should the government be able to force the private company to create a means of access to the storage medium?
I have to say that while I hold James Comey and Loretta Lynch in the absolute highest regard, I disagree with them on this point.
First, the government has plenty of power as it is. The government already has the absolute right to use its unlimited resources however it chooses to crack any storage device it seizes pursuant to a valid warrant. Are we so sure that’s not enough?
We certainly don’t think that the government has a general power to compel private-sector assistance whenever government resources fall short, do we? If Joe Rookie federal prosecutor is getting his butt kicked in court by Mark Geragos (an awesome private-sector trial lawyer), the government doesn’t get to commandeer John Quinn (another awesome private-sector trial lawyer) to come in and take over the trial to even things out. In other words, the fact that the private sector is better than the government at something is not, in and of itself, sufficient grounds for the government to command the private sector to go to work for it.
Second, I am not persuaded that Apple will precipitate a national-security crisis by creating a storage device that no one can break into.
Readers should know that I am generally center-right on national security and law-enforcement issues (I have a long paper trail that anyone can follow and that I’ll proudly defend in any confirmation hearing). Maybe I’ll be proved wrong, but I think the burden of persuasion here should be on the government.
Think about it: the only information the government can’t get here is information that is stored on this phone and nowhere else. Anything backed up to the cloud, the government can get. Anything emailed, the government can get. Anything on social media, the government can get. Anything written down on paper, the government can get. And anything told to a witness who can be persuaded to cooperate, the government can get.
Third, and relatedly, since the beginning of the Republic, we’ve had storage containers that the government can’t break into. They’re called our minds. It would always be a huge benefit in any investigation if the government could hack into a suspect’s mind and get at the information inside. But we don’t allow that. Our shameful experiment with “enhanced interrogation” in Guantanamo is all the reminder we need of why the basic prohibition against compelled self-incrimination is in the Constitution.
So there are simply some things a civilized society does not do. Extracting information from suspects by torture is one. How about extracting information from a phone by forcing a private company to create software that potentially imperils the privacy and security of millions of its consumers?
Is that another? I think it could be. And I’d rather have our legislatures debate the issue and pass a law one way or another. If we want to make end-to-end encryption illegal, we should do so explicitly. If we want to bar companies from selling hack-proof information storage containers, we should do so explicitly. We have not done so yet. And I don’t think it’s obvious that we should want our government to have this power, just as it’s not obvious that this power is within the scope of the government’s authority under the All Writs Act.
(Finally, it’s worth noting that Apple may design future phones so that they cannot be updated at all while the password screen is locked. On such phones, the approach requested here wouldn’t work, and Apple could argue that it literally cannot break into the phone. Thanks to Orin Kerr, always required reading on computer-search issues, for this point.)
Caleb Mason is a regular contributor to TCR’s Viewpoints. He is a partner at Brown White & Osborn in Los Angeles, and a former federal prosecutor. He welcomes comments from readers.