I respectfully disagree with my @Wired colleague @GiladEdelman. More content moderators just mean more bad moderation. The very concept of content moderation assumes a simplistic model of content and ignores cultural and linguistic dynamics and diversity. https://www.wired.com/story/stop-saying-facebook-too-big-to-moderate/
The problem is not a lack of resources. It's the breadth of Facebook's user base -- 2.7 BILLION people constantly uploading video, images, sound, and text in more than 110 languages.
Facebook's biggest market, India, has 300 million FB users posting in more than a dozen languages.
The very idea that FB could hire enough people to moderate properly even in India is absurd. That's because it's not about the number of people. It's about the job itself.
In a language like Malayalam (45 million speakers, mostly in Kerala), the range of expressions that could violate FB policies changes rapidly. Every language changes rapidly. Every language has cultural cues for hate speech or violence that the uninitiated won't catch.
It's also not about money, because finding people willing to enforce Facebook's rules OVER, say, the wishes of Muslim extremists in Pakistan or Buddhist extremists in Myanmar or Sri Lanka is impossible. Doing that job is dangerous.
Once again we see these arguments play out within the narrow range of American concerns like the "Plandemic" video or Donald Trump's posts. That misrepresents how Facebook actually works in the world.
That said, content moderation is but one sliver of the problem that Facebook presents to the world. That's why I had to write a whole book about it.