
(Photo: Flickr/Dave Newman)

Illuminating Going Dark: A Conversation with the FBI

The Going Dark debate has been simmering for years, and it reached a boiling point earlier this year when the FBI went to court against Apple in an attempt to force the company to create a tool to break the default encryption on an iPhone 5c used by one of the San Bernardino shooters.

While the FBI found an alternative method to crack that particular iPhone and the court case stalled, encryption and access to digital evidence still pose a problem for the bureau's investigations.

Security Management Assistant Editor Megan Gates sat down with Sasha Cohen O'Connell, the FBI's chief policy advisor for science and technology, to learn more about these issues and the FBI's view of Going Dark. Their conversation has been lightly edited for clarity.

Gates: From the FBI's perspective, what is the Going Dark problem?

O'Connell: The issue for us is the inability to get access to digital evidence. This is not a situation where the U.S. Department of Justice is looking for new authorities; it is about exercising the authority we already have…and our inability to access content data, even with due process.

Gates: When did the FBI begin noticing that it was having a problem obtaining content? Was it before Apple decided to move to default end-to-end encryption on its iPhone operating system?

O'Connell: It was definitely before that point. There are folks in [the FBI headquarters building] who've been working this for over a decade.

It's something we've seen coming and have been trying to sound the alarm about. The big difference was, as you noted, when things started to move to default end-to-end encryption. Since Apple's announcement a year ago, it's been exponential growth. It's not just the bad guys seeking out end-to-end encryption; it's the bad guys' associates, and the bad guys' victims, and trying to exonerate people falsely accused of crimes.

When you reach a world of default encryption, you're touching all of those people, not just the small subset that might seek out that kind of end-to-end encryption.

Gates: Beyond encryption, what are some of the other issues that you're seeing when it comes to Going Dark?

O'Connell: Things like anonymization. There are situations where it's increasingly difficult for us to get to attribution—the ability to operate anonymously creates a whole set of issues for us when it comes to investigations. That's one kind of classic example.

There's a more basic example around simple data retention, too. We serve a warrant on a company, and they just frankly don't keep the data. So again, the outcome is the same. We don't have access with legal process to that content that we need.

And then the encryption piece, of course, is specifically end-to-end encryption. We don't have a problem with encryption. We love strong encryption; we use strong encryption, and we encourage others to do so. The issue comes when it's only the end user who has access.

We work all the time with companies that use what we term provider access. That works wonderfully in terms of matching up with legal process. The easiest way to explain it is to look at the phone I carry. Nothing on it is classified, but there's a lot of sensitive information on it, and it is encrypted.

But you'd better believe that if I get hit by a bus tomorrow, the bureau can get into this phone. That's because there's an enterprise access point that we obviously manage carefully. But it exists. And most companies use the same model.

Major e-mail providers that, for business reasons, want to push ads or scan for malware and spam do this. You can't do that if you can't access content, so that's provider-controlled access.

Gates: Why do you think that encryption use has increased dramatically over the past few years? Is it because of the increase in data breaches or a rise in privacy concerns after the Edward Snowden leaks?

O'Connell: We're certainly not the experts; we don't know why the market does what it does. But we encourage the use of encryption. The FBI is out there saying, 'Use strong encryption.' And that is, again, because of the increased focus on data security. As our lives and our personal information move online, there's no doubt that there's increased concern around data security. We're leading that charge; there is no doubt about it.

What we want to point out is that we shouldn't throw the baby out with the bathwater by pursuing data security at any cost. There are multiple values at stake, so it is a balance between data security and other kinds of public safety issues.

There's never going to be absolute data security, so the question is, where in that range can we be? We see products today that exist in that safe range and also allow us to exercise our lawful authorities. We just don't want to slip into a situation where it's data security at all costs, at the expense of any other values.

If we're heading that way, we just want to make sure that we're doing that in an informed way—that we understand what it means when we go all the way to the end for data security.

Gates: What's your ideal solution to the current problem? If the FBI could have anything it wanted, what would it want to see done?

O'Connell: Director Comey's addressed this recently. He said his job is two things. One, right now we need to keep investigations moving forward. We have no option—that's our obligation. So we will do what we need to do within legal parameters to do that. That's where you've seen some litigation in the past couple months.

Then the other thing is he feels a real obligation to help inform the country to make informed decisions. Because at the end of the day, it is not our decision to make. But what we want to ensure is that people understand the trade-offs and the implications.

Gates: Right now, there's no immediate solution. Beyond what Congress may or may not do, do you think the FBI needs more resources and needs to increase its technical knowledge and skill to keep pace with technology?

O'Connell: We can't resource our way out of this problem. This is a global problem; this is a problem for our state and local partners who will never have the resources. If we continue on the trajectory we're on, there's no way to resource out of that.

Gates: Many counter-arguments have been raised about Going Dark. One report from Harvard University's Berkman Center, funded by the Hewlett Foundation, said we're not Going Dark and that we're living in a golden age of surveillance. What do you think of that criticism?

O'Connell: There's more data available today, so there are two issues. One, what's the denominator? Everything's online now, so more is accessible. But you also have to remember, there's nothing left in the pocket.

In the past, we'd get a subject and there'd be pocket litter or there might be a written diary, or notes by the phone. None of that exists anymore.

Then we get to the nature of the data that's available, and this gets to the metadata conversation. 'Well, can't you just use metadata? Doesn't that solve all your problems?' Metadata is wonderful; we use it all the time. But there are some things it will never do for us. Metadata will tell you that I'm talking to you…but it can never tell us definitively what the content of that conversation is.

That creates a number of problems. Maybe we know, based on other things, that subjects are planning an attack and the timing is being discussed, but the FBI can't see the content. So content is king when it comes to investigation and also to prosecution, when you think about showing intent.

As we move up and request additional authorities from the courts or inside our building, we often have to show that the metadata is not enough. If you're going to go up on a wiretap, part of what you have to demonstrate is that metadata is not enough.

Gates: Other experts, like those quoted in the MIT report Keys Under Doormats and on the Cryptographers' Panel at the RSA Conference, keep saying that the type of encryption the FBI wants people to use, provider access, introduces vulnerabilities that someone else could potentially exploit and is not a good idea. What's your response to that criticism?

O'Connell: When you talk about what is technically feasible, it's important to distinguish between a normative, academic conversation and a practical conversation. When you're in the normative world with folks who are academics, whom I love and want involved in this process, the conversation is about perfect data security.

When you're in the world of real-world limitations, we know there's no such thing as perfect security. People make tradeoffs every day. Perfect security outside of an academic context doesn't really exist.

Academically, they're correct. Any entry point, no matter how carefully managed, does introduce vulnerability. Of course it does. But move over into the real world, where we use products every day that allow provider access for convenience, for advertising, for spam tracking, for a thousand reasons that also make sense to us, and we're still within a reasonable risk, or what the market has accepted as a risk.

Gates: If the Going Dark issue isn't addressed, what's the worst case scenario from the FBI's perspective?

O'Connell: We think about it in four buckets. One, we see a delay in cases. You saw this, for example, in San Bernardino. If we can't get access to evidence, there's going to be a delay.

The second piece is that we see a diversion of resources. When we have a situation, we ask people, 'Can you break in?' Sometimes we can. As our head of science and technology says, 'When the moon's aligned just right and I have a coat hanger, sometimes we can.'

But it takes a lot of resources, so we're pulling them from other things we're doing. That has an impact on other work. Even when we focus on one case and succeed, it may have taken three units of people two weeks. So you've got the delay problem, and now you also have the diversion-of-resources problem.

After that, you have the two scenarios we worry the most about. One, our inability to prevent something. In that time delay, something happens that we were unable to prevent because we couldn't see that content. Or, on the other side, we can't solve something.

Imagine a world where this is not solved. The FBI is going to be slower, and state and local law enforcement as well. You're going to see this across the country.

Gates: The head of Europol also recently said that it's facing the same challenge. Do you see law enforcement beyond the United States having these same issues and wanting similar access to data?

O'Connell: Absolutely. All of our partners are in the same situation. Everyone we work with; everybody has the same challenges we do.

We're in a unique position because most of the companies that make the best products are here in the United States, so it amps up the issue for us. But there's no doubt that this is an international problem that will most likely be solved with international norms. There's got to be some sort of consensus globally, to the extent that it can happen.

Gates: Is it realistic to think that people are going to come together—internationally—to solve this Going Dark issue?

O'Connell: I think that's what the companies want; frankly, it would be easier for them. What they don't want is a patchwork of solutions…we've heard from companies that it would be nice if it were uniform globally, because their worst-case scenario is a patchwork of different rules. And we're starting to head down that path.