The violence in Washington didn't happen overnight. It was the culmination of a four-year saga of lies - mostly played out on @Facebook, @Twitter and @YouTube - that politicians, Big Tech and, importantly, all of us are to blame for https://www.politico.eu/article/us-capitol-hill-riots-lay-bare-whats-wrong-social-media-donald-trump-facebook-twitter/
Here's how we got here:
We gotta start where this all began: the 2016 US presidential election, which saw widespread Russian interference using a variety of social media dirty tricks in an attempt to influence voters https://www.justice.gov/storage/report.pdf
But it didn't take long for other groups to latch onto these tactics, using them for their own partisan goals. First up: US alt-right influencers pushing leaks from @EmmanuelMacron's 2017 presidential campaign (leaks, ironically, that came from Russia) https://www.nytimes.com/2017/05/06/world/europe/emmanuel-macron-hack-french-election-marine-le-pen.html
That plan didn't pay off. But it signaled to many domestic groups (in the US and elsewhere) that Russia's disinformation playbook was extremely useful to sow division, peddle online falsehoods & potentially sway elections.
It was a sign of things to come.
Fast forward to later in 2017 when German far-right communities were bombarded by (likely Russian-backed) disinformation ahead of the country's federal election. The goal: to get them to the polls, and it worked https://www.politico.eu/article/far-right-german-voters-more-likely-to-believe-fake-news-study-says/
This became increasingly common: the blurring of domestic & foreign groups in peddling falsehoods, making it almost impossible to know who was behind the campaigns or how to combat them -- not least because of the obvious freedom of expression concerns about limiting political speech
But what started in the US, and then expanded to France and Germany, didn't stop there. In Italy's 2018 election, disinformation ran rife, often promoted by domestic politicians. That left social media companies in a bind over how to respond https://www.bbc.co.uk/news/world-europe-43214136
But things only became murkier. Ahead of the 2018 Irish abortion referendum, US groups bombarded the country w/ @Facebook political ads, trying to tilt the scales in favor of the pro-life camp https://www.politico.eu/article/foreign-groups-invade-ireland-online-abortion-referendum-debate-facebook-social-media/
That finally woke social media companies up (two years after the 2016 US vote) to the potential harms of political ads, leading them to ban foreign groups from buying such digital messages linked to elections/social issues.
Yet the political ads stuff was becoming old news. In Sweden's 2018 election, the govt prepared for foreign interference -- only to find that the disinformation came from domestic groups. Awks. https://www.bloomberg.com/opinion/articles/2018-11-15/fake-news-roiled-sweden-s-elections-but-it-was-homegrown
The 2018 US mid-terms also showed how much things had changed. While foreign interference was on everyone's minds, rightwing groups had spent the last 2 years building up a significant digital ecosystem that often mimicked Russia's disinformation playbook
That included the promotion of highly partisan news outlets like The Western Journal as a way to sidestep traditional media outlets to create partisan echo chambers HT: @bydanielvictor https://www.nytimes.com/2019/08/22/us/western-journal-highlights.html
It also included far-right influencers like Charlie Kirk, Dan Bongino and others amassing millions of followers, many of whom had been fed a diet of disinformation and partisan attacks that undermined their trust in decades-old institutions. (This was right out of the Russian playbook)
Complicating matters further were the evolving tactics. Disinformation started to appear on @WhatsApp, the encrypted messenger, where it was almost impossible to fact-check or delete such claims
It became a massive problem ahead of Brazil's 2018 election ( https://www.reuters.com/article/us-brazil-election-whatsapp-explainer-idUSKCN1MU0UP), then moved on to India's 2019 election ( https://www.theatlantic.com/international/archive/2019/04/india-misinform) and is now ingrained anywhere ppl use WhatsApp ( https://www.politico.eu/article/the-coronavirus-covid19-fake-news-pandemic-sweeping-whatsapp-misinformation/)
Amid this crisis, politicians did almost nothing to combat these problems. In the UK, officials had been warning for years that disinformation was a problem, but London did absolutely nothing ahead of the country's 2019 election https://www.politico.eu/article/uk-general-election-facebook-misinformation-boris-johnson-interference-russia/
The UK wasn't the only one. As politicos railed against Big Tech to "do something" about disinformation, politicians squabbled over whether social media was biased against conservative voices or whether it should be left to companies to regulate themselves.
The companies, too, were extremely slow to act, often hiding behind arguments of freedom of expression. And when they did act, it was a piecemeal approach, often leaving major gaps in their defenses https://www.politico.com/news/2020/03/06/stealth-political-ads-flourish-on-facebook-122539
Soz, day job came calling. Where was I? Oh yes, the companies. In part, they outsourced this to third-party fact-checkers to do the heavy lifting on deciding what was, and what wasn't, ok. That system is broken https://www.politico.eu/article/coronavirus-fake-news-fact-checkers-google-facebook-germany-spain-bosnia-brazil-united-states/
They also relied on armies of contract content moderators who had to sift through the most vile of content (not just disinformation). As you can imagine, that did not go well HT: @CaseyNewton https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona
But when social media companies tried to rely on automation instead, that too led to major problems. The COVID-19 pandemic offered a real-world test case on how this would work. It did not go well https://www.politico.eu/article/facebook-content-moderation-automation/
Amid these problems, govts worldwide focused almost entirely on foreign influence, mostly because they still viewed disinformation through a 2016 lens. Problem is: they aren't very good at catching this stuff https://www.politico.eu/article/misinformation-disinformation-uk-general-election-canada-facebook/
In part, that's because it's now very hard to say what is foreign and what is domestic. The playbook is now the same for both, and unless you have categorical proof of where something comes from, it's anyone's guess, really.
So that takes us to the COVID-19 crisis -- a crucible in which extremist groups, populist politicians and conspiracy theorists have run amok in pushing falsehoods across social media. https://www.politico.com/news/2020/05/12/trans-atlantic-conspiracy-coronavirus-251325
2020 offered a real-world test for almost 4 years of trial-and-error on disinformation tactics. Fake websites, check. Coordinated activity, check. Doctored images, check. Real-world harm, double check.
Into that world stepped the 2020 US election. Big Tech had finally woken up to the threat, and arguably did a good job at limiting foreign interference. But the companies were still woefully outgunned ahead of November's vote https://www.politico.com/news/2020/08/04/silicon-valley-election-misinformation-383092
In part, that's because domestic groups -- those who had used the last 4 years to try out new tactics overseas https://www.politico.eu/article/us-nationalists-far-right-europe-elections-digital-facebook/ -- were now the main promoters of disinformation, and companies were very unwilling to take meaningful action https://www.politico.eu/article/russia-is-back-wilier-than-ever-and-its-not-alone/
This was the culmination of years of quasi-planned disinformation coordination, fueled by Trump's lies of voter fraud, a sophisticated right-wing influencer online network and a US media landscape in which much of the country didn't trust what they read outside their bubbles
So what happened after Biden's victory? Well, that well-oiled machine kicked into gear, promoting hashtags like #StopTheSteal and others to coordinate online and, increasingly, offline https://www.politico.com/news/2020/11/04/maga-trump-claims-voter-fraud-434099
It led to clashes in Arizona (HT: @tina_nguyen https://www.politico.com/news/2020/11/05/sharpie-ballots-trump-strategy-arizona-434372), as well as increasingly violent talk in fringe social networks that did little, if anything, to stop disinformation from spreading https://www.politico.com/news/2020/11/13/extremists-fringe-social-media-election-fraud-436369
And, to bring this full circle, that's what led to Wednesday's violent clashes. This didn't happen overnight; it didn't just appear from nowhere; and it isn't just one person's fault that this got ugly, fast.
And, even now, US extremist groups are using the Capitol Hill violence as a rallying cry for even more online/offline action -- ironically, going beyond rightwing influencers who have now lost control (if they ever had any) of the movement
. @BostonJoan put it better than I could. The riot is "hashtags come to life" -- the embodiment of a four-year downward cycle that has laid bare the problems w/ social media (HT: @tina_nguyen) https://www.politico.com/news/2021/01/07/right-wing-extremism-capitol-hill-insurrection-456184
Man, that got dark, apologies. To make up for that, I give you a puppy in a letter box.
Rant over. Thoughts appreciated.