
In recent years, dominant social media platforms like Facebook and Twitter have increasingly been perceived as discriminating against conservative and right-wing viewpoints, especially by conservatives themselves. Such concerns were exacerbated by Twitter's and Facebook's deplatforming of then-President Trump in response to his tweets and posts leading up to and during the January 6th insurrection. Trump's deplatforming, coupled with the platforms' recent removal of Covid- and election-related misinformation, led to cries of censorship by conservatives and increased calls for regulation of the platforms. Supreme Court Justice Thomas took up this charge (in an opinion relating to a different controversy involving Trump's Twitter practices) and suggested a regulatory path forward for lawmakers seeking to hold the platforms liable for alleged viewpoint discrimination against and censorship of conservative voices. Justice Thomas's suggested playbook for regulation was adopted by several state and federal lawmakers, who have proposed a host of legislative measures designed to address these concerns.

In this Article, I examine the desirability and constitutionality of recent federal and state legislative initiatives that seek to provide remedies for these alleged ills, including the proposed federal DISCOURSE Act, the 21st Century FREE Speech Act, the PRO-SPEECH Act, and the PACT Act, as well as state laws like those enacted in Florida and Texas and introduced in every state in the country. These measures seek to rein in the discretion the dominant platforms exercise in content moderation decisions, to prohibit the platforms from engaging in viewpoint discrimination (whether human-moderated or algorithmically implemented), and to impose notice, transparency, and other due process-type obligations on them. This Article analyzes the key elements of such proposed legislation in light of the obligations that the U.S. government historically has imposed on common carriers and broadcasters. It then examines the procedural dimensions of our free speech commitments and values and our commitments to due process, including those enshrined in the International Covenant on Civil and Political Rights (which was referenced by the Facebook Oversight Board in its review of Facebook's suspension of Trump from its platform). These due process principles require that speech regulations be clear and precise, that those subject to regulation be provided clear notice of such regulations, that the regulations be enforced in a non-discriminatory and transparent manner, and that enforcement be subject to an opportunity to challenge, especially where the consequences of such enforcement are substantial.

This Article concludes with a favorable assessment of the desirability and constitutionality of certain aspects of proposed legislation that would require platforms like Facebook and Twitter to comply with certain principles of nondiscrimination and due process as recognized under the First Amendment, the Due Process Clause, and the International Covenant on Civil and Political Rights, and that would prohibit these platforms from engaging in certain types of viewpoint or speaker-based discrimination. This Article contends that, while the platforms should continue to enjoy the discretion to regulate many categories of speech that would otherwise be protected by the First Amendment (such as threats, non-obscene pornography, and medical misinformation) and to moderate content and restrict speakers who clearly violate their terms of service, the dominant platforms should not engage in blatant viewpoint or speaker-based discrimination and should accord their users certain due process-type protections. These include the right to receive meaningful advance notice of the platforms' content guidelines and terms of service; clear notice when users' speech is censored or otherwise regulated or when a speaker is deplatformed; information about which particular content guideline was allegedly violated; and a meaningful opportunity to challenge content moderation decisions where such moderation severely restricts users' exercise of free speech.
