Internet companies were once the darlings of Capitol Hill, celebrated by lawmakers as examples of American innovation. It’s safe to say the honeymoon is over.
Online platforms have been too easily weaponized by cybercriminals, terrorists, harassers, revenge pornographers and foreign adversaries in recent years, and many in Congress have rightly concluded that tech companies have not taken such threats seriously enough.
As a result, Congress is reportedly considering the elimination or scaling back of Section 230 of the Communications Decency Act, which states that, unless one of a few narrow exceptions applies, websites are not legally responsible for user-generated content.
The most important part of Section 230 is only 26 words long, but the law arguably created the internet we know today. An all-out repeal of the provision would have sweeping social consequences.
Section 230 was enacted in 1996, but its origins can be traced back to a 1959 U.S. Supreme Court case and, of all things, a Los Angeles bookstore. In 1956, Eleazar Smith, the 72-year-old proprietor of a bookstore on Main Street, a few doors down from where the Nickel Diner now sits, was arrested after a clerk at the store sold a Los Angeles police officer a copy of the pulp novel “Sweeter Than Life” by Mark Tryon, which was considered obscene under city and state law.
At trial, Smith testified that it took him months to read a single book, so there was no way he could personally review each of the thousands of books in his store. A local judge disagreed and sentenced Smith to 30 days in jail.
The U.S. Supreme Court reversed Smith’s conviction, concluding that California law violated the 1st Amendment because the statute penalized Smith even if he had no reason to know of the obscene book. Smith vs. California resulted in 1st Amendment protections for bookstores, newsstands and other content distributors.
For decades, these protections were not terribly difficult to apply — until the early 1990s, when companies such as CompuServe and Prodigy began connecting their customers’ computers to the internet.
CompuServe faced a defamation lawsuit over an article it distributed in a newsletter. A federal judge in Manhattan dismissed the suit, holding that CompuServe received the same 1st Amendment protections as Smith’s bookstore.
A few years later, a financial executive, angry about anonymous posts on an online bulletin board, filed a defamation lawsuit against Prodigy. A New York state court judge on Long Island denied Prodigy the protection that Smith and CompuServe received, reasoning that Prodigy had moderated its user content. Because Prodigy attempted to block defamation, smut and hate speech from its services, it was deemed liable for every word that its users wrote.
Congress recognized that this outcome was untenable, and Section 230 was included in the Communications Decency Act. Its key words are: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Without the certainty provided by Section 230, Yelp, Facebook, Twitter, Snapchat, Google and many other platforms simply wouldn’t exist in their current forms. Saddled with the legal responsibilities of traditional publishers, no company could afford to defend the accuracy of every tweet, post or video created by its users.
Moreover, Section 230 is what allows people to have a voice online. Without the provision, the #MeToo movement would not have grown as quickly as it did, and viral videos of people engaging in racist behavior would not raise awareness among millions of people.
But the internet that Section 230 created is far from a utopia, as one of the provision’s authors acknowledged last year. “I’ve written laws to keep the old rules off your back,” Sen. Ron Wyden of Oregon told tech companies at a content-moderation conference at Santa Clara University. “And I did it under the idea that it was possible for technology leaders to do better. I’m concerned that your employers are now proving me wrong, and time is running out.”
We don’t need to choose between the status quo and an all-out repeal of Section 230, however. Instead, platforms should immediately revamp their content moderation policies and procedures, as some are now starting to do, beginning with more moderators and better automated technology.
No moderation system will be perfect enough to catch all illegal and harmful content, but there is significant room for improvement. Repealing Section 230 would be an overcorrection, one that would fundamentally alter the internet for the worse.
Congress provided Section 230’s extraordinary immunity because it thought that technology companies could responsibly oversee the use of their services. Many platforms have failed to live up to their end of the bargain. They should do so now, or risk losing the free speech that undergirds the modern internet.