1/4 A thread on why this article matters and how it shows (again) that Facebook is insincere about how it applies its own policies, and about whether politics and power preservation play a role. I saw it firsthand while working at Facebook, and wrote about it too... https://www.washingtonpost.com/technology/2020/11/01/facebook-election-misinformation
2/4 “Fear of appearing biased against Trump & other conservatives...has shaped everything from the algorithm deciding what appears in the News Feed to the process of reviewing potentially harmful content.” Hence our #1 recommendation here: cc: @kreissdaniel https://citap.unc.edu/additional-steps-platforms-can-take-to-protect-the-vote/
3/4 “Members of FB’s public policy team...floated a proposal...to escalate harmful posts...so that 50% would be conservative & 50% would be liberal, even if the material was not equivalent in potential risk.” I wrote about this exact same example from my FB team https://www.wired.com/story/the-real-reason-tech-struggles-with-algorithmic-bias/
4/4 Bottom line: Facebook has made an intentional decision to allow certain political figures to violate its rules & spread disinformation without consequence. And FB still operates with no accountability. Power & profit over democracy, period. cc: @lizzadwoskin @isaacstanbecker