What is a content moderator? An FAQ
What is a Facebook content moderator?
Content moderators are essential workers in the day-to-day running of Facebook. It is their job to review content posted to the platform by Facebook’s 2.8bn global users. This content arrives on moderators’ computers in queues, or tickets, similar to work assignments in a call centre. Moderators must assess each piece of content flagged to them and apply Facebook’s policies to decide whether it is allowed to remain on the platform.
How many content moderators are there?
Facebook has not revealed the full number, but the New York Times reported in 2021 that the figure was around 15,000.
Are content moderators only based in the US?
No. There are moderators in the Philippines, India, Ireland, Poland, Germany, Kenya and many other countries.
Are content moderators employed by Facebook?
Some moderators are directly employed by Facebook, but the majority work for outsourcing companies such as Accenture, Covalen, Sama and others. Outsourced moderators generally receive worse pay, working conditions and mental health support than their directly employed colleagues.
Is the work dangerous?
Yes. Moderators routinely view extremely distressing content, including hate speech, graphic violence, sexual abuse, murder, violence against animals and the sexual exploitation of children. Regular exposure to this material takes a heavy toll on moderators’ mental health and can lead to depression, anxiety, paranoia and even PTSD. The policies used to sort content also change constantly, making the job even more difficult and stressful.
What mental health support is provided by Facebook?
Facebook says it is the responsibility of its outsourcing companies to provide mental health support. Moderators have complained for years that the support they receive is inconsistent at best and often very poor. Outsourcing companies like Accenture provide staff known as “wellness coaches”, but many lack the clinical training needed to provide adequate mental health support, and moderators say there are far too few of them.
Is the danger of the work reflected in their pay packets?
No. The New York Times reported in 2021 that for every hour a US-based content moderator works, Facebook pays Accenture $50; for that same hour, the moderator is paid $18. Meanwhile, in Kenya, TIME magazine has revealed that some moderators work for less than $2 per hour.
Why haven’t I heard of content moderators before?
By design. Without content moderators, Facebook would be overwhelmed with horrific content and collapse overnight. But the work is gruelling and dangerous, which is why Facebook outsources it to other companies, so it can claim that these conditions are not part of its own operation. On top of that, outsourcing companies force moderators to sign non-disclosure agreements as a condition of employment, gagging them from talking about their experiences. It’s only through the bravery of moderators breaking those NDAs that their stories are beginning to be told.