Any opinion here, which I have tried to minimize, is just my own and not that of the National Center for Supercomputing Applications or the University of Illinois where I work. I am merely trying to bring the perspective of a security professional and CISO who thinks in terms of risks and not absolutes to the discussion, which has to date been filled with superlatives. While some components of the debate are truly binary, the debate as a whole is not.
- Privacy or mass surveillance. We give up a lot more privacy just by virtue of having a smart phone. And this case, at least in the short term, is about targeted interventions that would require Apple’s assistance on a case-by-case basis.
- Backdoors. The FBI is not requesting a general backdoor capability. You can tie a software update to a specific device ID, and by digitally signing the software, make it something the FBI and others cannot tamper with before loading. Apple is not being asked to give up its private signing key, at least not yet.
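The device binding described above can be sketched as follows. This is a simplified illustration, not Apple's actual scheme: it uses an HMAC as a stand-in for the asymmetric code signature Apple would use in practice, and the key and device IDs are made up. The point is only that a signature computed over both the firmware and a device ID is useless on any other phone and detects tampering.

```python
import hashlib
import hmac

# Hypothetical signing key. In reality the manufacturer would use an
# asymmetric private key that never leaves its control, and devices
# would verify with the corresponding public key.
SIGNING_KEY = b"manufacturer-secret-key"

def sign_update(firmware: bytes, device_id: str) -> bytes:
    # Bind the signature to both the firmware image and the target
    # device ID, so the signed blob only validates on that one phone.
    return hmac.new(SIGNING_KEY, firmware + device_id.encode(),
                    hashlib.sha256).digest()

def verify_update(firmware: bytes, device_id: str, signature: bytes) -> bool:
    expected = sign_update(firmware, device_id)
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(expected, signature)

firmware = b"unlock-assist-build"
sig = sign_update(firmware, "DEVICE-123")

print(verify_update(firmware, "DEVICE-123", sig))                 # target device
print(verify_update(firmware, "DEVICE-999", sig))                 # wrong device
print(verify_update(b"tampered-" + firmware, "DEVICE-123", sig))  # modified image
```

A signature that fails on any other device ID is what makes this a case-by-case assist rather than a reusable master key, though of course the technique used to build the image could be reapplied.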
- Going dark. This is not about “going dark” as the intelligence and law enforcement agencies have been saying. Police didn’t even have this treasure trove of data a decade ago to “lose”. And as many have pointed out, we are in a golden age of surveillance.
- Terrorists and pedophiles running amok. People have always taken secrets to the grave. People have always been able to destroy evidence, burn records, etc. This is not new, and this is not a binary question of having or not having evidence anymore.
- Unhackable black boxes. Information security has never been harder. There are always vulnerabilities, especially to targeted attacks. If the target is still alive, you can trick them into downloading malware, monitor their keystrokes with TEMPEST technologies, or even tamper with the device while it is out of their control. Even in this specific case, it seems likely there are other approaches to attack the hardware.
- Full device encryption. Full disk encryption has been available for two decades, and it has been built into major operating systems for at least a decade. If I am dead and my laptop is off, good luck getting that data. Even backup solutions support strong encryption now.
Now it would be disingenuous to say nothing has changed at all for either tech companies or law enforcement (LE). The first change is that LE’s free ride may be over. Phones are becoming more like laptops, with high-grade security for data at rest. When smart phones first came out, there was a period when they held all the data sources of a typical PC without any way to really protect them like a computer. No doubt it would be frustrating to lose that.
Second, high-grade security is on by default. This still isn’t true of your laptop, even though it is trivial to turn on. Recent changes by Apple have made this so on their mobile platforms, and this is very good for the many people who lose their devices each year. However, it is sure to be a frustration to LE as much as it is to criminals.
Third, in a post-Snowden era, US tech companies are less trusted internationally, and the citizens of the U.S. trust their own government less. This threatens our tech companies’ business and a key sector of the U.S. economy, and so they are responding to this threat. Devices that manufacturers cannot attack themselves are more attractive, especially in international markets.
There are lots of angles to discuss in this whole saga: legal implications, motives, politics, economics, etc. I would like to focus on what the real risks and actual trade-offs are. We can’t make informed decisions without understanding these, and understanding these trade-offs may offer a compromise that mitigates some of the risks.
Evidence is lost. There is no doubt that strong encryption will have an impact on investigations, particularly for the deceased. We have seen this happen with laptops and computers used by criminals for at least two decades. However, we should be reassured that this doesn’t seem to be a dead end too often. A good reason why is that we are also in the golden age of surveillance. We have more devices, more accounts, and more network traffic than ever to analyze for relationships. Our digital footprint is nearly impossible to stamp out, even for a security professional, and it isn’t getting easier. We need to get away from the worst-case scenario of the terrorist or pedophile that might go free. That just pushes this conversation into hyperbole.
We create poor, reactionary legislation. President Obama is quite correct to cast this as a risk to the tech industry. Pushing back as Apple has could force legislative answers, and these could easily make things worse and not better. The history is not good here.
A backfire effect. By pushing Apple so hard in this case, the FBI is incentivizing them to create even more secure phones. It would not be hard to make their Secure Enclave wipe its keys before allowing a firmware update. And if Apple does this, it is likely that competitors will do likewise to compete on security features. This could instigate a stronger ecosystem of phone security, much to the chagrin of LE. It could be argued that we got to this point as a reaction to the actions of the U.S. intelligence agencies and LE. Congress could potentially address this in legislation, but even then it cannot control international manufacturers.
All Writs slippery slope. We don’t really know where this ends if tech companies can be compelled to trojan their own software or devices. Will they have to do it in secret? How pervasive would that be? Would they be forced to do such updates over-the-air and remotely? Citizens have reason to be concerned in a post-Patriot Act, post-Snowden era of national security letters and extended executive powers. It’s conceivable that legislation could address some of these concerns, but again, that path may not turn out well for either side of this debate.
Human rights violations abroad. If tech companies have to do this for the US government, what other governments will they have to do this for? It becomes much harder for them not to provide such a service to a repressive regime when our own government demands such a service. The only viable solution may be to have devices that the manufacturer cannot crack themselves. It is hard to imagine turning over such requests to an international tribunal to decide when Apple must help unlock a phone.
Attacks get more expensive for law enforcement. If tech companies do not have to support these sorts of requests, or move to a more secure solution that prevents their help, low-level attacks on the baseband or hardware become the only viable method to recover data. These aren’t foolproof, and they are much more time consuming. So even if we don’t end up losing access to evidence, it certainly becomes more expensive to get at.
Economic damage. We are a digital economy, and much of our innovation is in our tech companies. If we force them to weaken their security while at the same time the U.S. intelligence agencies and LE are widely mistrusted, we incentivize people to use products of companies from other nations. We can’t stop the development of devices abroad, and it is unlikely that we can stop the sale of them here. Even if we can stop sales, we can’t stop software and mathematics. It would be harder for criminals and terrorists to secure their phones, but the algorithms are all out there, and a black market is likely to fill the need for the expertise. Just look at how the black market of cybercrime has commoditized almost every part of the pipeline. We would have better luck taking guns away from “bad guys” than crypto.
Is there any compromise here? Security certainly isn’t binary, and we aren’t talking about losing all access to evidence. In fact, we are talking about losing it in only a subset of circumstances. We are most likely talking about making evidence more expensive to get at. The question remains whether we can avoid some loss of evidence without hurting our own economy and human rights abroad. It’s unclear whether that is the case, and it is quite possible that we could make the situation worse for all parties involved. There may be legislative solutions that protect our companies from having to handle such requests from repressive regimes while also being transparent enough that people can trust the U.S. not to go down a slippery slope. But trust has to be built over time, so I am skeptical of a quick solution here. At heart, though, this is not a technical problem but a social one. If the US government can compel manufacturers to attack their own products, regardless of the technology involved, where does that authority stop? That’s the decision before us, and there are risks associated with every answer.