Deputy Attorney General Jeffrey A. Rosen Delivers Remarks at Justice Department's Lawful Access Summit
Thank you for that kind introduction.
I’d like to thank each of you for being here today. Before I get into my remarks, I hope you’ll join me in recognizing those who have spoken this morning. I’d especially like to salute the courageous survivors and their family members for gracing us with their stories, and for inspiring us with their strength.
The title of today’s summit is “lawless spaces.” And that’s exactly what warrant-proof encryption creates: walled-off areas in the digital world that are impervious to the light of scrutiny by the judicial system. Those areas remain dark even when a neutral judge has found that the constitutional balance has been satisfied, and the judge has ordered that sworn law enforcement officers should have access to the specified evidence in order to protect public safety.
Outside the digital world, none of us would accept the proposition that grown-ups should be permitted to mingle in closed rooms with children they don’t know in order to groom them for sexual exploitation. Neither would we ever accept the idea that a person should be allowed to keep a hoard of child sexual abuse material from the scrutiny of the justice system when all of society’s traditional procedures for protecting the person’s privacy, like the Fourth Amendment’s warrant requirement, have been satisfied. But in the digital world, that is increasingly the situation in which we find ourselves.
Take, for example, a recent case where information derived from a publicly available peer-to-peer network indicated that, in early 2017, an individual at a particular physical address was requesting child sexual abuse material through that network. Investigators obtained a search warrant and seized the suspect’s computer. It was encrypted. The suspect denied downloading child pornography and subsequently retained counsel. To this day, law enforcement has been unable to access the computer to conduct a search. To this day — over two and a half years later — the suspect remains at large, neither cleared of wrongdoing nor charged with a crime.
Let’s take things one step further. Assume that the suspect did in fact download child sexual abuse material. What if the files on his encrypted device would help us identify child victims of sexual abuse? And, because we could not obtain access to the data stored on that device even with court authorization, what if those abused children are still out there, waiting to be rescued?
These are not sensationalistic scenarios. They are real cases that raise the hard questions that we, as a society, need to confront. Privacy is important. So is cybersecurity, and the integrity of user data. Those are good and necessary things. But there are other important values, like user and public safety, that also need to be considered and factored in. If we lose sight of that, we fail to honor the victims of these horrible crimes — victims who are often the most vulnerable members of our society.
I am not for a moment suggesting that we should “weaken” encryption. As we confront the problem of “warrant-proof” encryption, nobody is calling for secret “back doors” to communications systems, even though that is often how the issue is misreported. As FBI Director Wray said this morning, law enforcement seeks a front door — that is, access through a transparent and publicly acknowledged system, and only once we have secured the authorization of a court. And we don’t want the keys to that door. The companies that develop these platforms should keep the keys, maintaining their users’ trust by providing access to content only when a judge has ordered it.
That is exactly how it is with traditional telecommunications providers. Every day, companies like AT&T, Verizon, and Sprint provide law enforcement with targeted lawful access to the content of phone communications in ways that promote public safety — but only after the government has complied with the rigorous requirements of the law, and a judge has authorized access. Why should internet technology companies operate under different rules? For a young girl who is being trafficked for sex, it makes no difference whether her tormenters are communicating via traditional voice calls over a cell phone, or via an encrypted internet app. But it makes a huge difference to the investigators trying to find her, as they can gather the first category of electronic evidence, but not the second. From a policy point of view, it doesn’t make any sense.
As we’ll hear later this morning when the Attorney General and his international counterparts take the stage, other rule-of-law nations like the United Kingdom and Australia have, through the legislative process, begun addressing this problem. The fact is, this challenge is pervasive, and it touches society at all levels, from state and local, to federal, to international. It will take serious discussion — and real action — at each of those levels for us to adequately confront it. And it will take real effort on the part of all stakeholders for us to move towards solutions.
The good news is that, at least in the child exploitation context, stakeholders have begun coming together to attack the scourge of online child exploitation in powerful ways. NGOs, including parents’ groups and victims’ rights groups, are critical parts of civil society that have raised awareness of this issue. You heard this morning the impactful stories that victims tell, and the important work that organizations like the National Center for Missing & Exploited Children (NCMEC), Team HOPE, and the Canadian Centre for Child Protection undertake to protect and to empower survivors.
You’ve also heard about how the internet has transformed the scope and nature of child exploitation, and how, candidly, it has generated a massive problem for our society. In the face of this problem, law enforcement needs to redouble its efforts, and collaborate in new and creative ways. Other important stakeholders in the broader conversation need to up their game, too. You’ve heard how some companies in the technology industry have made strides in helping confront this challenge by, for example, voluntarily searching communications on their systems for known child sexual abuse material using hash values or PhotoDNA. This technique has led to an explosion of reports of suspected child pornography offenses to NCMEC’s CyberTipline, exceeding 18 million last year. Since 2010, the number of CyberTips that have been sent to the Internet Crimes Against Children Task Forces for investigation by law enforcement has increased by more than 555 percent. And as John Walsh told you, the resulting investigations have led to the rescue of children who were being sexually assaulted. Law enforcement benefits from the leads voluntarily generated by private sector companies, and we truly thank them for their work. But, unfortunately, the numbers themselves make clear that there is significantly more to be done.
And that’s where we run into a more difficult part of the conversation. The monitoring practices I’ve described are impossible with end-to-end encryption, which will dramatically reduce the number of reports to the CyberTipline. For example, an estimated 70 percent of Facebook’s reporting, which last year totaled well over 16 million reports of child sexual exploitation and abuse globally, would likely be lost if the company deploys end-to-end encryption across all of its platforms, as it has publicly announced it plans to do. And we have little insight into the actual volume of child sexual abuse material being traded on platforms that already use end-to-end encryption.
It’s worth focusing for a moment on the platforms that employ end-to-end encryption, because the metrics we do have are chilling. Consider Apple, which reported a grand total of eight tips in 2017; 43 in 2018; and fewer than 150 to date in 2019. When contrasted with the millions of tips reported by Facebook over the same time frame, is the take-home point that Apple magically ran platforms free of child exploitation as the volume of child exploitation materials grew by massive amounts everywhere else on the internet? Or is it that such companies could not see the harmful illicit activity occurring on their platforms because they chose to avert their eyes by deploying end-to-end encryption? Simply stated, end-to-end encryption prevents many investigations from even getting started, and leaves many victims undetected. Some companies have completely favored the privacy of their users over the safety of their users.
My hope is that today’s summit drives that point home. Right now, technology companies make unilateral decisions that have profound impacts on public safety. Our society needs to understand the stakes. Protecting users of the internet and communications services often entails monitoring what is traveling through those systems, whether it is malware, stolen intellectual property, terrorists’ communications, or images of child sexual abuse. If we are to move to a world where even judge-approved search warrants become useless to the protection of exploited children, and to public safety more broadly, our country needs an open discussion of the costs that such technology platforms will be imposing on all of us. If our efforts to make the virtual world more secure leave us more vulnerable in the physical world, that decision should be an informed one.
One can strongly support privacy and civil liberties, but still find it hard to fathom why a digital tech platform would provide warrant-proof spaces for child exploiters, or even terrorists, when a judge has found a lawful need for access consistent with the Fourth Amendment to the US Constitution. Why not have a regime where cybersecurity and encryption remain compatible with lawful access to data when a judge issues a search warrant?
I thank each of you for being here today to contribute to this critically important discussion.
Updated October 4, 2019