GW Law Faculty Publications & Other Works

Document Type

Article

Publication Date

2024

Status

Forthcoming

Abstract

Section 230 of the Communications Decency Act generally immunizes online platforms such as Facebook, YouTube, Amazon, and Uber from liability for third-party user content (e.g., posts, comments, videos) and for moderation of that content. This article addresses an important issue overlooked by both defenders and critics of Section 230: the implications of the law and proposed reforms for Black communities. By relieving tech platforms of most legal liability for third-party content, Section 230 helps facilitate Black social activism, entrepreneurship, and artistic creativity. Section 230 also relieves platforms of most legal liability for content moderation, which boosts platforms’ freedom to remove or downrank unlawful activity, as well as an array of “lawful but awful” content that government is constitutionally unable to restrict—such as hate speech, white supremacy organizing, medical disinformation, and political disinformation. Unfortunately, however, platforms’ overly broad interpretations of Section 230 also create incentives for platforms to allow unlawful activity directed at Black communities—such as harassment, white supremacist violence, voter intimidation, and housing and employment discrimination—and foreclose legal recourse when platforms erroneously downrank Black content. These insights supply factors that can help policymakers assess whether proposed Section 230 reforms—such as notice-and-takedown, content moderation neutrality, and carve-outs to immunity for algorithmic recommendations and advertisements—will benefit or harm Black communities.

GW Paper Series

2024-40

Included in

Law Commons
