The real threat to Section 230's future comes not from people who are angry about being censored, but from people who are angry that platforms were weaponized and did not do enough to reduce the harms.
And that's a debate that we need to have. I think there are a lot of challenges in addressing these problems by simply changing Section 230, but it's reasonable to examine how the liability framework affects the distribution of misinformation and other harmful content.
But to have this debate, we have to operate from a factual record, which has largely been lacking in the Section 230 debate for the past few years. What types of moderation are actually possible? How does moderation affect other speech? What do platforms do in response to changes to Section 230?
What protections would platforms have under the First Amendment and common law without Section 230? Could federal criminal laws (which are exempt from Section 230) be amended to improve accountability? How do we prevent the dominant platforms from becoming even more dominant?