As Parler disappears from the Android and iOS app stores and faces being kicked off of Amazon's (and other) clouds, people who worry about monopolized corporate control over speech are divided over What It Means.

1/
There's an obvious, trivial point to be made here: Twitter, Apple and Google are private companies. When they remove speech on the basis of its content, it's censorship, but it's not GOVERNMENT censorship. It doesn't violate the First Amendment.

2/
And yes, of course it's censorship. They have made a decision about the type and quality of speech they'll permit, and they enforce that decision using the economic, legal and technical tools at their disposal.

3/
If I invited you to my house for dinner and said, "Just so you know, no one is allowed to talk about racism at the table," it would be censorship. If I said "no one is allowed to say racist things at the table," it would also be censorship.

4/
I censor my daughter when I tell her not to swear. I censor other Twitter users when I hide their replies to my posts. I censor commenters on my blog when I delete their replies.

Dress it up as "content removal" or "moderation" if you'd like, but it's obviously censorship.

5/
That's fine. Different social spaces have different rules and norms. I disagree with some censorship and support other censorship. Some speech is illegal (nonconsensual pornography, specific incitements to violence, child sex abuse material) and the government censors it.

6/
Other speech is distasteful or hateful (slurs, insults) and the proprietors of different speech forums censor it. This legal-but-distasteful speech is a mushy, amorphous category.

7/
I'm totally OK with hilarious dunks on the insurrectionists who stormed the Capitol. Tell jokes about Holocaust victims and I'll throw you out of my house or block you.

And when I do, you can go to your house and tell Holocaust jokes.

8/
I'm not gonna lie. I don't like the idea of anyone telling Holocaust jokes anywhere. Or rape jokes. Or racist jokes. But I have made my peace with the fact that there are private spaces where that will happen.

9/
I condemn those spaces and their proprietors, but I don't want them to be outlawed.

Which brings me back to Parler. It's true that no one violates the First Amendment (let alone CDA 230) (get serious) when Parler is removed from app stores or kicked off a cloud.

10/
But we have a duopoly of mobile platforms, an oligopoly of cloud providers, and a small conspiracy of payment processors. Their choices about who may speak are hugely consequential, and a concerted effort by all of them could make some points of view effectively vanish.

11/
This market concentration didn't occur in a vacuum. These vital sectors of the digital economy became as concentrated as they are due to four decades of shameful, bipartisan neglect of antitrust law.

12/
And while failing to enforce antitrust law doesn't violate the First Amendment, it can still lead to government-sanctioned incursions on speech.

The remedy for this isn't forcing the platforms to carry objectionable speech.

13/
I got into a good discussion of this on a private mailing list this morning, then adapted my posts and published them in the public "State of the World 2021" discussion on @TheWELL.

https://people.well.com/conf/inkwell.vue/topics/510/State-of-the-World-2021-page04.html#post82

15/
There are three posts: the first deals with Apple and Google's insistence that they removed Parler because it lacked an effective hate-speech filter. Given that there is no such thing as an effective hate-speech filter, this is obvious bullshit.
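As a rough illustration of why (this sketch is not from the thread; the blocklist and example messages are invented purely for illustration), here is a toy keyword filter in Python showing how mechanical filtering fails in both directions:

# Toy sketch: a naive keyword-based "hate speech" filter.
# The blocklist and example messages are invented for illustration only.

BLOCKLIST = {"vermin", "subhuman"}

def naive_filter(message: str) -> bool:
    """Return True if keyword matching would block this message."""
    words = {w.strip(".,!?;:'\"").lower() for w in message.split()}
    return bool(words & BLOCKLIST)

# False positive: quoting or condemning abuse trips the same keywords.
print(naive_filter("Calling refugees 'vermin' is dehumanizing and wrong."))   # True

# False negative: coded language and dog whistles sail straight through.
print(naive_filter("You know exactly which people I mean. Act accordingly."))  # False

Real filters are more sophisticated than this, but the two failure modes (context-blindness in one direction, easy evasion in the other) are the same ones that make an "effective" automated hate-speech filter implausible.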

16/
The second addresses the fundamental problems of moderation at scale, where a platform relies on a large workforce of moderators to enforce policies against "hate speech."

https://people.well.com/conf/inkwell.vue/topics/510/State-of-the-World-2021-page04.html#post83

17/
The biggest problem here is that "almost-hate-speech" is emotionally equivalent to "hate speech" for the people it's directed at. If tech companies spell out exactly what counts as hate speech, trolls will deploy almost-hate-speech (and goad their targets into crossing the line, then narc them out).

18/
And if tech companies tell moderators to nuke bad speech without defining it, the mods will make stupid, terrible mistakes and users will be thrown into the meat-grinder of the stupid, terrible banhammer appeals process.

19/
The final post asks what Apple and Google should do about Parler.

https://people.well.com/conf/inkwell.vue/topics/510/State-of-the-World-2021-page04.html#post84

20/
They should remove it, and tell users, "We removed Parler because we think it is a politically odious attempt to foment violence. Our judgment is subjective and may be wielded against others in future. If you don't like our judgment, you shouldn't use our app store."

21/
I'm 100% OK with that: first, because it is honest; and second, because it invites the question, "How do we switch app stores?"

eof/
ETA: Here's an ad-free, surveillance-free blog version of this thread as a permalink:

https://pluralistic.net/2021/01/09/the-old-crow-is-getting-slow/#deplatforming