I started writing a few tweets, but then it turned into Section 230 braindump time; strap in, everybody. So, I had no intention of becoming a lawyer and didn't think law was interesting until after undergrad, when I got heavily involved in some little weirdo site called Wikipedia.
Most of my tech policy views are shaped by early involvement in Wikimedia, back when it was hiring its first staff, volunteers did almost everything, and the site was just about to become indispensable. It's through that lens that I see why Section 230 is important to the internet.
Because when people say 230 is a gift to Big Tech, I can only assume their experience of the internet is limited to Big Tech. If anything benefits disproportionately, it is the smaller sites, the ones that don't get the gifts of attention and scale.
Section 230 is a procedural rule (thanks, @CathyGellis) that does reduce costs for big companies, it's true. But it also reduces costs for small companies and individual site operators, and that has a much greater proportional effect.
Do you hate the idea of the monoculture where only a few private, unaccountable parties arbitrarily decide what you can and can't say online? Me too. Do you want to participate on a site where no one can moderate what they consider to be garbage? You almost certainly don't.
(No, really, you don't. Many people have tried, and if you think this is what you want, go ahead and try it; I will support your efforts. And then, when that goes up in flames, come back and join the "moderation is necessary and hard" discussion.)
I support the arbitrariness of site moderation policies *because* of the ability of other sites to exist as competitors. But operating a site that hosts user speech is hard, and thriving user communities require suitable conditions; they are hard to start and easy to kill.
You can't put too many restrictions or too much delay on user contributions, or users will simply go somewhere else unless they are extremely motivated. But if one bad user can kill a growing site despite that site's best efforts, what can you do but restrict users?
Before "content moderation" or "trust and safety" were commonplace terms people used, Wikipedia had an active and motivated team of volunteers doing both of these things. (I know; I was one of them.) But things still slipped through. Sometimes terrible ones.
(Famously, the site once claimed that John Seigenthaler was involved in the JFK assassination, which is not true; https://en.wikipedia.org/wiki/Wikipedia_Seigenthaler_biography_incident has the details.)
Less famously, and more like a thousand duck bites, it also attracted the ire of scammers whose scams were truthfully described, some of whom did in fact file defamation suits. At least one of those suits was dismissed using Section 230.
(The background check for law licensure requires listing and describing every lawsuit in which one has been involved personally or as an officer/director. I hope the committee enjoyed my submission, since before applying I had been a board member of WMF for several years.)
Content moderation errors genuinely harm people. To be in favor of Section 230 does not mean pretending that speech does not cause harm. But for those who *want* to do better, it allows moderators to try, without making the site worse off than if it had done nothing.
But to be subject to numerous lawsuits that could not be cheaply dismissed, simply for hosting speech that was not extensively pre-reviewed? The site would have gone under. Until the mid-2000s, one relatively uncomplicated suit could have cost the entire annual budget.
The alternate-universe, pre-reviewed encyclopedia did exist, BTW. It was called Nupedia, and Wikipedia was just a sandbox project for it. Nupedia was shuttered after three years, with 25 completed articles and another 150 in progress. Wikipedia surpassed that in its first month.
It is a great failure when the sites that can moderate do it poorly, ignoring bad behavior, incentivizing it, or even profiting from it. The solution to this is difficult, because repealing 230 does not stop harmful speech. It makes all speech more costly to host.
Where speech is more costly to host, who will host user speech without monetizing it by selling your data and your attention? Who hosts the speech that is not commercializable, not because it is not valuable but because the market does not accurately reflect its value?
I can guess who *would* host it: the ones with fully staffed legal and compliance departments, the ones with an advertising team, the ones who decide just to take down the most difficult types of content (speech from marginalized communities, difficult truths, activism) to avoid trouble.
And I would rather have Wikipedia.
(end braindump)