Only the state is (even theoretically) obligated to respect constitutional rights. Corporate platforms are not bound by constitutional limits.
Even if they were, Trump’s incitements to violence fall well outside the bounds of constitutionally protected speech. /2 https://twitter.com/davidsacks/status/1347924658843234307
If anything, the problem is not the precedential value of Twitter’s decision vis-à-vis Trump, but rather its seemingly arbitrary choice of which particular tweets to cite as the basis for his suspension.
He has been inciting violence for a long time. Why only now? /3 https://twitter.com/0rf/status/1347993653264998404
The reason Twitter didn’t act until now is that, after Wednesday’s attacks, a swath of previously oblivious people grew more attentive to facts they had long chosen to ignore.
That only demonstrates the general arbitrariness of corporate content moderation. /4 https://twitter.com/ananavarro/status/1347712875243188225
There is an irony here worth noting. Trump recently vetoed the NDAA, demanding reforms to Section 230 of the Communications Decency Act.
He described 230 as a liability shield protecting Silicon Valley tech companies.
That’s partly true, but tells only half the story. /5 https://twitter.com/evan_greer/status/1348331669254774785
230 is a liability shield for platforms, but its ultimate beneficiaries are Internet users.
Because they’re not state actors, tech firms are at liberty to censor user-generated content—but they historically haven’t had an incentive to do so.
Section 230 enables that regime: because platforms aren’t liable for what users post, they face no legal pressure to take it down. /6 https://twitter.com/latimesopinion/status/1347166194902700032
Removing the liability shield would *create an incentive* for tech firms to censor their users.
Corporate content moderation has long been arbitrary. (See #4 above)
But SESTA & FOSTA showed how making platforms liable for user speech can place people at physical risk. /7 https://twitter.com/repjayapal/status/1207075759686397952
We offer a stronger, more visionary solution to the concerns animating calls to reform Section 230.
Rather than repeal the platform liability shield, we propose mandating platform interoperability.
That would free users to choose their platforms, with profound benefits. /8 https://twitter.com/doctorow/status/1288854906422439937
As it relates to content moderation, mandating interoperability would create avenues to circumvent arbitrary corporate decisions.
If one platform de-platforms or silences a dissident, they could then effectively take their audience and content to another.
Today? No chance. /9
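To make the idea concrete, here is a minimal sketch of what mandated interoperability could look like, assuming a common, legally required interchange format for accounts. Every name in it (Platform, Account, export_account, import_account, migrate) is hypothetical and purely illustrative; none corresponds to any real platform API.

```python
# Hypothetical sketch of a mandated interoperability layer.
# All names here are illustrative only, not any existing platform's API.
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class Account:
    """A portable bundle of a user's identity, content, and audience."""
    handle: str
    posts: list[str] = field(default_factory=list)
    followers: set[str] = field(default_factory=set)


class Platform:
    """A toy platform that honors a shared export/import interface."""

    def __init__(self, name: str):
        self.name = name
        self.accounts: dict[str, Account] = {}

    def export_account(self, handle: str) -> Account:
        # Hand the user a complete, portable copy of their data.
        return self.accounts[handle]

    def import_account(self, account: Account) -> None:
        # Accept an account exported from any other compliant platform.
        self.accounts[account.handle] = account

    def suspend(self, handle: str) -> None:
        # Moderation still happens locally; the user keeps their data.
        self.accounts.pop(handle)


def migrate(user: str, source: Platform, destination: Platform) -> None:
    """Move a user's posts and follower graph from one platform to another."""
    destination.import_account(source.export_account(user))


if __name__ == "__main__":
    birdsite = Platform("birdsite")
    elsewhere = Platform("elsewhere")

    birdsite.import_account(
        Account("alice", posts=["hello world"], followers={"bob", "carol"})
    )

    # With a portable format, a suspension no longer strands the audience:
    # the same data can simply be re-imported somewhere else.
    migrate("alice", birdsite, elsewhere)
    birdsite.suspend("alice")

    print(elsewhere.accounts["alice"].followers)  # {'bob', 'carol'}
```

In this toy model, each platform still moderates locally (it can suspend whomever it chooses), but the user's posts and follower graph remain portable, which is the property the thread argues a mandate should guarantee.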
Ultimately, concerns around Section 230 stem from the proprietary control over the walled gardens that each of the tech firms maintains.
At root, that’s an antitrust problem. Tinkering with 230 would “solve” the wrong problem. And antitrust presents better solutions. /10 https://twitter.com/doctorow/status/1305528065024389120
Mandating interoperability could find justification in the essential facilities doctrine long established in antitrust law.
While it has fallen out of favor in the courts, Congress should restore the relevance of that doctrine by statutorily enshrining it. /11 https://twitter.com/shahidforchange/status/1313966388545900545
Beyond mandating interoperability and codifying the essential facilities doctrine, Congress could also create publicly accountable bodies to hear appeals from corporate content moderation decisions, or alternatively nationalize the behemoths outright. /12
The Section 230 liability shield regime protects platforms AND users, but still allows corporations to arbitrarily remove user content.
Companies shouldn’t be making content moderation decisions alone, and users deserve at least procedural rights. Public oversight can help. /13 https://twitter.com/jilliancyork/status/1348197486020730881