How To Fix Section 230

49 Pages · Posted: 11 Mar 2022 · Last revised: 6 Nov 2023

Date Written: March 10, 2022

Abstract

Section 230 is finally getting the clear-eyed attention that it deserves. No longer is it naïve to suggest that we revisit the law that shields online platforms from liability for illegality that they enable. The harm wrought is now undeniable, especially for victims of online assaults and intimate privacy violations. Time and practice have made clear that tech companies don’t have enough incentive to remove or otherwise combat online abuse, especially if it generates likes, clicks, and shares. Victims can’t sue sites that earn advertising fees from their suffering. The status quo is particularly costly for women, children, and minorities who lose their ability to speak, work, and love in the face of online abuse. Doing nothing says that society is fine with the vulnerable enduring abuse that robs them of their civil rights and civil liberties.

We need to fix Section 230. Reform must be approached with humility and care, lest it spur platforms to over- or under-moderate in ways that do more harm than good for the very people who most need help. The legislative solution offered here grows out of a decade of experience working with tech companies, victims of online abuse, and legislative staff. While the over-filtering provision, Section 230(c)(2), should be preserved and affirmed, the under-filtering provision, Section 230(c)(1), requires revision. First, the under-filtering provision should not extend to sites that purposefully or deliberately encourage, solicit, or keep up intimate privacy violations, cyber stalking, or cyber harassment. Those bad actors should have no right to invoke the legal shield. Second, the under-filtering provision should be conditioned on a duty of care in certain circumstances. In cases involving intimate privacy violations, cyber stalking, or cyber harassment, platforms would enjoy immunity only if they could prove that they took reasonable steps to address such abuse, even if they failed to tackle it in a particular case. That way, platforms would have a legal incentive to design content moderation practices to address abuse that inhibits self-expression and ruins livelihoods. Rather than an unguided duty of care, lawmakers should specify the obligations involved, drawing on key lessons from the trust and safety field.

Keywords: Free speech, privacy, cyber law, legislation, torts, criminal law

Suggested Citation

Citron, Danielle Keats, How To Fix Section 230 (March 10, 2022). Boston University Law Review, forthcoming; Virginia Public Law and Legal Theory Research Paper No. 2022-18. Available at SSRN: https://ssrn.com/abstract=4054906

Danielle Keats Citron (Contact Author)

University of Virginia School of Law

580 Massie Road
Charlottesville, VA 22903
United States
