I’ve done two interviews in the past day with media about the Twitter/Facebook bans and the Parler shut down.
I’m trying to use these opportunities to stress something that is getting lost in all the debates about these actions: free expression is the wrong conversation to be having. 1/
It’s pretty clear at this point the First Amendment doesn’t apply. The only people making that argument are craven politicians who are trying to gin up anger.
An expression debate focuses on the content. What I keep underlining is that social media is a distribution platform too. 2/
What this means is that when someone gets on and posts conspiracy theory content or seditious incitement, it isn’t a question of whether it will be distributed. The publishing platform has the tools for outreach and connection that let that content travel. 3/
Think about me in my basement printing up some nutty manifesto on my laser printer. I still have a problem of reach. How do I get that message out? I could walk the neighborhood or the city, but I’m limited by time and ability.
Working with others takes social connection. 4/
Publishing through a third party allows for mass printing, but the real power is distribution. Publishers have marketing, sales, and shipping arms to make sure those things end up in the hands of others. 5/
All of that in legacy media requires relationships, contracts, etc. Just ask Josh Hawley about his book.
So while Twitter and Facebook did make publishing much easier, the real shift was that they completely blew up the barriers to distribution. 6/
Free speech debates focus on the content. If you look at the statements from the social networks carefully, you can see their larger concerns are about how that material travels, how it flows from a single person to infect people who might not otherwise dabble in that stuff. 7/
And look, the companies tried. I think the efforts were weak, but they did try to avoid outright bans: tags, fact checks, oversight boards, etc. They knew the problem was spread. What did conservatives do? They called it censorship anyhow. 8/
Whatever your view of the speech merits, if you’re a social media exec and you see that your platform was used to spread mountains of dangerous lies that led to an insurrection, that we were minutes away from members of Congress being captured ... feel the weight of that. 9/
The bans here were not an attempt to silence *conservatives*.
(Really, if you see the news that Twitter bans 70,000 white supremacist and QAnon accounts and think “Oh god they are banning conservatives!” then it might be good to reflect on what you think conservatism is.) 10/
No, the bans are about limiting distribution of toxic material used to incite violence. But more crucially, that material is a recruitment tool.
It would be like your local paper printing Nazi content but also providing a self-addressed envelope for you to join up. 11/
I want to stop here and highlight the excellent Oxygen of Amplification report published by Data & Society a couple of years ago.
http://bit.ly/2Q4w5rl
It goes much deeper on the radicalization and recruitment problem. Critical read, IMO. 12/
The social networks have slowly realized they have a problem. They are creating the groups and havens that suck people in. Regular friend connections become exposure points to extremist content when that friend from high school surfaces stuff from QAnon. 13/
One way to think about this problem is that Facebook and Twitter are realizing content is only part of the problem. The deeper problem is the relationships the platform creates, how that social capital becomes a road along which dark information travels. 14/
The friendship part isn’t something these networks can easily solve. They can’t limit who you call a friend. They can sever the association on their platform, of course, but the impact of that is limited.
No, their main tools are to work on the content side and limit reach. 15/
With any solutions, we need to think hard about the distribution side. This isn’t just a debate about the right to publish (in the case of private companies, again we are talking about philosophical and not legal rights).
One place to start is these companies’ scale. 16/
We can’t break up Facebook like we did with AT&T. The platform is, by its nature, hard to decentralize, and that centralization is exactly what makes it attractive for extremists recruiting others.
Maybe they shouldn’t have been allowed to buy Instagram and WhatsApp? Maybe? 17/
But we need more regulation of how information travels. Banning extremists and conspiracy theorists is a constant game of cat-and-mouse.
The platforms started with famous people because, with tons of followers, they are the ultimate origin point for the vector. 18/
There is some indication that getting rid of the superspreader accounts helps. It forces extremist groups to reorganize. It cuts them off from a main group and creates splinter groups. This makes it harder to track them, but it also makes it harder for them to coordinate. 19/
Until now, these groups have not had to operate on the run. The organizing has gone on largely in the background, with groups becoming origin points for people to start spreading this stuff on their own personal pages. That’s what the networks are trying to attack here. 20/
So I think we need to look at the speech argument for what it is: a political tool being used by people with power to get people riled up. There certainly are speech policy issues in play here, but the threat is about distribution and recruitment by extremist groups. 21/21
@DemFromCT Thread maybe of interest to you ...