DEPARTMENT OF JUSTICE’S REVIEW OF SECTION 230 OF THE COMMUNICATIONS DECENCY ACT OF 1996

As part of the President's Executive Order on Preventing Online Censorship, and as a result of the Department's longstanding review of Section 230, the Department has put together the following legislative package to reform Section 230. The proposal focuses on the two major areas of concern highlighted by victims, businesses, and other stakeholders in the conversations and meetings the Department held to discuss the issue.  First, it addresses unclear and inconsistent moderation practices that limit speech and go beyond the text of the existing statute. Second, it addresses the proliferation of illicit and harmful content online that leaves victims without any civil recourse. Taken together, the Department's legislative package provides a clear path forward on modernizing Section 230 to encourage a safer and more open internet.

Cover Letter: A letter to Congress explaining the need for Section 230 reform and how the Department proposes to reform it.

Redline: A copy of the existing law with the Department's proposed changes in redline.

Section by Section: An accompanying document to the redline that provides a detailed description and purpose for each edit to the existing statute.

As part of its broader review of market-leading online platforms, the U.S. Department of Justice analyzed Section 230 of the Communications Decency Act of 1996, which provides immunity to online platforms from civil liability based on third-party content and for the removal of content in certain circumstances.  Congress originally enacted the statute to nurture a nascent industry while also incentivizing online platforms to remove content harmful to children.  The combination of significant technological changes since 1996 and the expansive interpretation that courts have given Section 230, however, has left online platforms both immune from liability for a wide array of illicit activity on their services and free to moderate content with little transparency or accountability.

The Department of Justice has concluded that the time is ripe to realign the scope of Section 230 with the realities of the modern internet.  Reform is important now more than ever.  Every year, more citizens—including young children—are relying on the internet for everyday activities, while online criminal activity continues to grow.  We must ensure that the internet is both an open and safe space for our society.  Based on engagement with experts, industry, thought-leaders, lawmakers, and the public, the Department has identified a set of concrete reform proposals to provide stronger incentives for online platforms to address illicit material on their services, while continuing to foster innovation and free speech.  Read the Department’s Key Takeaways.

The Department's review of Section 230 arose in the context of our broader review of market-leading online platforms and their practices, announced in July 2019.  While competition has been a core part of the Department’s review, we also recognize that not all concerns raised about online platforms (including internet-based businesses and social media platforms) fall squarely within the U.S. antitrust laws.  Our review has therefore looked broadly at other legal and policy frameworks applicable to online platforms.  One key part of that legal landscape is Section 230, which provides immunity to online platforms from civil liability based on third-party content as well as immunity for removal of content in certain circumstances.

Drafted in the early years of internet commerce, Section 230 was enacted in response to a problem that incipient online platforms were facing.  In the years leading up to Section 230, courts had held that an online platform that passively hosted third-party content was not liable as a publisher if any of that content was defamatory, but that a platform would be liable as a publisher for all its third-party content if it exercised discretion to remove any third-party material.  Platforms therefore faced a dilemma:  They could try to moderate third-party content but risk being held liable for any and all content posted by third parties, or choose not to moderate content to avoid liability but risk having their services overrun with obscene or unlawful content.  Congress enacted Section 230 in part to resolve this quandary by providing immunity to online platforms both for third-party content on their services and for removal of certain categories of content.  The statute was meant to nurture emerging internet businesses while also incentivizing them to regulate harmful online content.

The internet has changed dramatically in the nearly 25 years since Section 230’s enactment in ways that no one, including the drafters of Section 230, could have predicted.  Several online platforms have transformed into some of the nation’s largest and most valuable companies, and today’s online services bear little resemblance to the rudimentary offerings of 1996.  Platforms no longer function as simple forums for posting third-party content, but instead use sophisticated algorithms to promote content and connect users.  Platforms also now offer an ever-expanding array of services, playing an increasingly essential role in how Americans communicate, access media, engage in commerce, and generally carry on their everyday lives.

These developments have brought enormous benefits to society.  But they have also had downsides.  Criminals and other wrongdoers are increasingly turning to online platforms to engage in a host of unlawful activities, including child sexual exploitation, selling illicit drugs, cyberstalking, human trafficking, and terrorism.  At the same time, courts have interpreted the scope of Section 230 immunity very broadly, diverging from its original purpose.  This expansive statutory interpretation, combined with technological developments, has reduced the incentives of online platforms to address illicit activity on their services and, at the same time, left them free to moderate lawful content without transparency or accountability.  The time has therefore come to realign the scope of Section 230 with the realities of the modern internet so that it continues to foster innovation and free speech but also provides stronger incentives for online platforms to address illicit material on their services. 

Much of the modern debate over Section 230 has been at opposite ends of the spectrum.  Many have called for an outright repeal of the statute in light of the changed technological landscape and growing online harms.  Others, meanwhile, have insisted that Section 230 be left alone and claimed that any reform would cripple the tech industry.  Based on our analysis and external engagement, the Department believes there is productive middle ground and has identified a set of measured, yet concrete proposals that address many of the concerns raised about Section 230.

A reassessment of America’s laws governing the internet could not be timelier.  Citizens are relying on the internet more than ever for commerce, entertainment, education, employment, and public discourse.  School closings in light of the COVID-19 pandemic mean that children are spending more time online, at times unsupervised, while more and more criminal activity is moving online.  All of these factors make it imperative that we maintain the internet as an open and safe space.