Analysis

Updating the Internet

Restoring the Vision of Section 230


Included below is an executive summary of Updating the Internet: Restoring the Vision of Section 230.

Executive Summary

In 1994, the internet was still an experiment. At the beginning of the year, just 623 websites existed around the globe, and the World Wide Web remained a little kingdom cloistered among researchers, hobbyists, and early adopters. By the end of the year, however, that number had exploded to nearly 10,000 websites, and between January and December of the following year, the web grew to more than 100,000. This period marked the birth of the commercial internet: Amazon and eBay were founded, Netscape emerged as the first successful web startup, and its August 1995 initial public offering signaled that the internet was no longer a niche technology, but an economic force capable of transforming the United States.

It was against this backdrop of rapid innovation that Congress enacted the Communications Decency Act of 1996 (CDA). At the time, policymakers sought to protect fledgling internet companies, encourage competition, and ensure that the United States could plant its flag firmly in the emerging global digital marketplace. Central to that effort was Section 230 of the CDA, which established two key provisions: Section 230(c)(1), governing the treatment of interactive computer services as publishers or speakers of third-party content, and Section 230(c)(2), the “Good Samaritan” provision, intended to encourage voluntary content moderation without exposing companies to liability.

Thirty years later, however, the internet that Section 230 was designed to govern bears little resemblance to the digital ecosystem that exists today. A handful of powerful technology companies now shape how billions of people receive information, interact with one another, and engage in public discourse. Yet the legal framework governing platform liability has remained largely unchanged.

This paper does not call for repealing Section 230 or predetermining the outcome of lawsuits. Instead, it advances a more practical argument: the provisions of Section 230 that continue to serve their original purpose should be preserved, while interpretations that have stretched the statute beyond recognition should be corrected. The paper (1) traces Congress’s original intent, (2) explains how judicial interpretation has drifted from that purpose, (3) proposes targeted statutory clarifications to realign the law, and (4) addresses common concerns about reform.

Importantly, Section 230(c)(1) and Section 230(c)(2) serve distinct purposes, and reform must preserve that distinction. Section 230(c)(1) protects platforms from being treated as the publisher of third-party speech, but it was never intended to operate as blanket immunity for a platform’s own conduct. Section 230(c)(2), by contrast, protects companies that act in good faith to remove harmful content. Preserving (c)(2) is essential to ensure platforms can continue moderating responsibly while safeguarding free expression.

To restore this balance, this paper proposes four targeted clarifications to Section 230.

  1. Part I: Update Section 230 to reflect a forward-looking vision of the internet — Remove the statutory assumption that a “vibrant and competitive free market” currently exists for interactive computer services, and reframe competition as a policy objective rather than a factual premise.
  2. Part II: Clarify the distinction between speech and conduct — Clarify that Section 230(c)(1) applies to protected user speech, not all “information,” allowing courts to determine whether the alleged harm arises from expressive or non-expressive activity, such as design choices that are independent of what users say.
  3. Part III: Allow for distributor liability — Allow courts to apply traditional distributor liability, previously foreclosed by cases such as Zeran v. America Online, Inc., by clarifying that courts may evaluate whether platforms knew or should have known about unlawful third-party activity they facilitated and failed to address. In such cases, platforms should not be shielded by Section 230 immunity.
  4. Part IV: Limit immunity for platform design and features — Affirm that Section 230(c)(1) does not apply to platform design, ensuring accountability for product choices that amplify, recommend, or otherwise shape potentially illegal or tortious content.

Together, these reforms preserve the core protections Congress intended while restoring accountability for platform conduct that falls outside the statute’s original purpose. Section 230 helped nurture the early internet. Updating it for the modern digital environment will ensure the law continues to support innovation, competition, and responsible stewardship of the online public sphere.