Apple v. FBI: A Risk-based Discussion

Any opinion here, which I have tried to minimize, is my own and not that of the National Center for Supercomputing Applications or the University of Illinois, where I work. I am merely trying to bring the perspective of a security professional and CISO, one who thinks in terms of risks rather than absolutes, to a discussion that has to date been filled with superlatives. While some components of the debate are truly binary, the debate as a whole is not.

This is NOT about:
  • Privacy or mass surveillance. We give up a lot more privacy just by virtue of having a smart phone. And this case, at least in the short term, is about targeted interventions that would require Apple’s assistance on a case-by-case basis.
  • Backdoors. The FBI is not requesting a general backdoor capability. A software update can be tied to a specific device ID, and by digitally signing the software, you make it something the FBI and others cannot tamper with before loading (see the sketch after this list). Apple is not being asked to give up its private signing key, at least not yet.
  • Going dark. This is not about “going dark,” as the intelligence and law enforcement agencies have been claiming. A decade ago, police didn’t even have this treasure trove of data to “lose”. And as many have pointed out, we are in a golden age of surveillance.
  • Terrorists and pedophiles running amok. People have always taken secrets to the grave. People have always been able to destroy evidence, burn records, etc. This is not new, and this is not a binary question of having or not having evidence anymore.
  • Unhackable black boxes. Information security has never been harder. There are always vulnerabilities, especially to targeted attacks. If the target is still alive, you can trick them into downloading malware, monitor their keystrokes with TEMPEST technologies, or even tamper with the device while it is out of their control. Even in this specific case, there are likely other approaches to attacking the hardware.
  • Full device encryption. Full disk encryption has been available for two decades, and it has been built into major operating systems for at least a decade. If I am dead and my laptop is off, good luck getting that data. Even backup solutions support strong encryption now.
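
To make the device-tied, signed-update point above concrete, here is a minimal sketch in Python of how a vendor might bind a signed update to a single device. It assumes the third-party cryptography package and an Ed25519 key pair; the function names and the simple message format are my own illustrations, not Apple’s actual update-personalization scheme, which is more involved.

    # Minimal sketch: binding a signed firmware update to one device ID.
    # Assumes the third-party "cryptography" package (pip install cryptography).
    # The message format below is illustrative, not Apple's real one.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    vendor_key = Ed25519PrivateKey.generate()  # stays locked up at the vendor
    vendor_pub = vendor_key.public_key()       # baked into every device

    def sign_update(firmware: bytes, device_id: bytes) -> bytes:
        # The signature covers the image AND the target device ID, so the
        # signed blob cannot be replayed against any other device.
        return vendor_key.sign(firmware + b"|" + device_id)

    def device_accepts(firmware: bytes, my_device_id: bytes, sig: bytes) -> bool:
        # The device verifies against its own ID; altering either the image
        # or the target ID invalidates the signature.
        try:
            vendor_pub.verify(sig, firmware + b"|" + my_device_id)
            return True
        except InvalidSignature:
            return False

    sig = sign_update(b"new firmware image", b"DEVICE-123")
    assert device_accepts(b"new firmware image", b"DEVICE-123", sig)
    assert not device_accepts(b"new firmware image", b"DEVICE-456", sig)

This is why a signed, device-specific image is not a general backdoor: without the vendor’s private signing key, the FBI cannot retarget or modify it.
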
What’s Changed?

Now it would be disingenuous to say nothing has changed at all for either tech companies or law enforcement (LE). The first change is that LE’s free ride may be over. Phones are becoming more like laptops, with high-grade security for data at rest. When smart phones first came out, there was a period when they held all the data sources of a typical PC without any real way to protect that data as a computer could. No doubt it would be frustrating to lose that access.

Second, high-grade security is on by default. This still isn’t true of your laptop, even though it is trivial to turn on. Recent changes by Apple have made it the default on their mobile platforms, and this is very good for the many people who lose their devices each year. However, it is sure to frustrate LE as much as it frustrates criminals.

Third, in a post-Snowden era, US tech companies are less trusted internationally, and the citizens of the U.S. trust their own government less. This threatens our tech companies’ business, a key sector of the U.S. economy, and so they are responding to this threat. Devices that manufacturers cannot attack themselves are more attractive, especially in international markets.

The Real Trade-offs

There are many angles to discuss in this whole saga: legal implications, motives, politics, economics, etc. I would like to focus on the real risks and the actual trade-offs. We can’t make informed decisions without understanding them, and understanding these trade-offs may point to a compromise that mitigates some of the risks.

Evidence is lost. There is no doubt that strong encryption will have an impact on investigations, particularly those involving the deceased. We have seen this happen with laptops and computers used by criminals for at least two decades. However, we should be reassured that this doesn’t seem to be a dead end very often. A good reason why is that we are also in the golden age of surveillance. We have more devices, more accounts, and more network traffic than ever to analyze for relationships. Our digital footprint is nearly impossible to stamp out, even for the security professional, and it isn’t getting easier. We need to get away from the worst-case scenario of the terrorist or pedophile who might go free; that just pushes this conversation into hyperbole.

We create poor, reactionary legislation. President Obama is quite correct to cast this as a risk to the tech industry. Pushing back as Apple has could force legislative answers, and those could easily make things worse rather than better. The history here is not good.

A backfire effect. By pushing Apple so hard in this case, the FBI is incentivizing it to create even more secure phones. It would not be hard to make the Secure Enclave wipe its keys before allowing a firmware update. And if Apple does this, competitors are likely to follow suit to compete on security features. This could instigate a stronger ecosystem of phone security, much to the chagrin of LE. It could be argued that we got to this point as a reaction to the actions of the U.S. intelligence agencies and LE. Congress could potentially address this in legislation, but even then it cannot control international manufacturers.
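
To make the backfire scenario concrete, here is a toy sketch in Python of an enclave policy that destroys its key material before accepting any update the user has not approved. The class, method names, and behavior are hypothetical simplifications for illustration; this is not Apple’s actual Secure Enclave design.

    # Toy sketch of the policy described above: wipe key material before
    # accepting any firmware update the user has not approved. Entirely
    # hypothetical; real secure-element designs are far more involved.
    import os

    class ToyEnclave:
        def __init__(self):
            # Key material entangled with the user's passcode; without it,
            # the data encrypted on the device is unrecoverable.
            self._master_key = os.urandom(32)

        def apply_update(self, image: bytes, user_approved: bool) -> None:
            if not user_approved:
                # A coerced or unattended update reaches a device with no
                # keys, so new firmware cannot help brute-force the passcode.
                self._wipe_keys()
            self._install(image)

        def _wipe_keys(self) -> None:
            self._master_key = None  # in hardware: erase the fuse/flash cells

        def _install(self, image: bytes) -> None:
            pass  # flashing the image is out of scope for this sketch

Under such a policy, a compelled update like the one requested in this case would yield a device whose data is already gone.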

All Writs slippery slope. We don’t really know where this ends if tech companies can be compelled to trojan their own software or devices. Will they have to do it in secret? How pervasive would that become? Would they be forced to push such updates over-the-air and remotely? Citizens have reason to be concerned in a post-Patriot Act, post-Snowden era of national security letters and extended executive powers. It’s conceivable that legislation could address some of these concerns, but again, that path may not turn out well for either side of this debate.

Human rights violations abroad. If tech companies have to do this for the US government, for which other governments will they have to do it? It becomes much harder for them to refuse such a service to a repressive regime when our own government demands it. The only viable solution may be devices that the manufacturer cannot crack themselves. It is hard to imagine turning such requests over to an international tribunal to decide when Apple must help unlock a phone.

Attacks get more expensive for law enforcement. If tech companies do not have to support these sorts of requests, or if they move to more secure designs that preclude their help, low-level attacks on the baseband or hardware become the only viable way to recover data. These aren’t foolproof, and they are much more time-consuming. So even if we don’t end up losing access to evidence, it certainly becomes more expensive to get at.

Economic damage. We are a digital economy, and much of our innovation comes from our tech companies. If we force them to weaken their security at the same time that U.S. intelligence agencies and LE are widely mistrusted, we incentivize people to use the products of companies from other nations. We can’t stop the development of devices abroad, and it is unlikely that we can stop their sale here. Even if we could stop sales, we can’t stop software and mathematics. It would be harder for criminals and terrorists to secure their phones, but the algorithms are all out there, and a black market is likely to fill the need for expertise. Just look at how the black market of cybercrime has commoditized almost every part of the pipeline. We would have better luck taking guns away from “bad guys” than crypto.

The Road Ahead

Is there any compromise here? Security certainly isn’t binary, and we aren’t talking about losing all access to evidence; in fact, we are talking about losing it in only a subset of circumstances. Most likely we are talking about making evidence more expensive to get at. The question remains whether we can avoid some loss of evidence without hurting our own economy and human rights abroad. It’s unclear whether that is possible, and it is quite possible that we could make the situation worse for all parties involved. There may be legislative solutions that protect our companies from having to handle such requests from repressive regimes and that are also transparent enough for people to trust the U.S. not to go down a slippery slope. But trust has to be built over time, so I am skeptical of a quick solution here. At heart, though, this is not a technical problem but a social one. If the US government can compel manufacturers to attack their own products, regardless of the technology involved, where does that authority stop? That is the decision before us, and there are risks associated with every answer.

About Adam Slagell

Adam Slagell currently serves as the director of the Cybersecurity Division and Chief Information Security Officer at the National Center for Supercomputing Applications (NCSA), where he co-leads the security team for the NSF-funded XSEDE federation. He also serves as liaison for the Bro Project at the Software Freedom Conservancy and is a co-PI for the NSF Bro Center of Excellence, which brings its network security monitoring expertise and support to NSF-funded cyberinfrastructure and higher education.

6 Responses to Apple v. FBI: A Risk-based Discussion

  1. Good article. Good points. Good luck conveying the sense of it to public and politicians.
    The whole thing smells very, very fishy. I suspect a bit of double-bluff somewhere, no matter what the news says. No matter; let everyone work out his own conspiracies.
    This is a situation as transient as PC floppy disks. A few points:
    * ALL useful encryption systems are easily crackable; they have to be useful to the user at least.
    * Any competent cryptologist should be able to design whole classes of encryption even theoretically (let alone in practice) uncrackable without the right key, even if the enemy cryptanalyst were permitted unlimited chosen-code attacks.
    * The entire backdoor concept is flawed, flawed, flawed, so wrong-headed that even Jacks in office and politicians should be able to see it. It would be simpler to make it an offence not to reveal data when ordered to do so by a court under appropriate circumstances. And simpler to control what happens to the data, without jeopardising other users’ rights or privacy.
    * Once it were recognised that off-the-shelf or built-in encryption is not a protection against directed, sanctioned attack, anyone but an idiotic crook would design or go on the market for a custom-built uncrackable encryption scheme, and all the standard cackle about backdoors could be relegated to lovers’ letters and the like.
    * The government would generally not bother with the backdoors, because no one in serious need of serious security would use anything but uncrackable encryption. Practically any other traffic would hardly ever be of interest to anyone able to crack it.

  2. Mike says:

    Thanks for a thorough analysis.

  3. Carl Seghers says:

    To create software that is tied to a device, you need (1) to create the software, and (2) to tie it to the device. After you’ve done (1), the cat is loose; that’s the whole point.

  4. Chris Jones says:

    I appreciate the thorough look at this, and the avoidance of the red herrings. I have repeatedly taken to task people who have made a big issue of “privacy” in the encryption debate, for numerous reasons. Most directly, because even if you presume to have privacy despite using a smartphone which is inherently leaky, championing privacy is a wrong-headed means of garnering sympathy and really is the least of your concerns if strong encryption becomes compromised. If you’re more worried about someone at the NSA seeing your amorous text messaging session than a criminal utilizing the same back-door or stolen escrowed keys to grab your credit card number as you’re ordering a pizza, your priorities are way the hell out of whack. If you’re making a case to your representative to avoid weakening encryption, which is going to play better: “make it easier for criminals to hide from police so that my text messages are kept private” or “make it easier for some criminals to hide so that other criminals can’t steal billions of dollars a year because encryption has been screwed over”? I tend to think that they’ll care less about the sanctity of your text messages than commerce in general.

  5. CS says:

    I totally agree this is not a privacy issue and am troubled that that is what most people think.

    My concerns are:

    the dangerous precedent set by the absurd misapplication of a very old, very vague, very broad Act to incredibly complex technical, legal, ethical, and national security issues, which the drafters could never have imagined, let alone contemplated;

    allowing the government by judicial act to compel a private person (here, a corporation but it applies equally to an individual) TO DO SOMETHING, which it otherwise has no legal obligation to do.

  6. Loraine O says:

    correct me if i’m wrong. i believe that especially on spy movies. they won’t have this broad idea on the investigative story that they make if they ain’t got that much knowledge about something. everything comes from something big or small.. especially the gadgets! after watching 007 movies when a new movies’ up. i just feel so compromised. you just blame me that yet i’m a girl. nevertheless, great movies though, only shows how far we’ve come starting from stones as tools! -V,,
