Why I’m so keen to talk to Facebook content moderators
Cori Crider
I’m a lawyer. Easy cases aren’t my thing. I’ve represented Guantánamo detainees and victims of torture against the US Defense Department and CIA. Dozens of my clients are free today. In one case, we forced Prime Minister Theresa May to apologise for the UK’s role in the rendition of dissidents to Libya. So I know what it takes for people without power to beat those who have it.
The law can be a powerful tool for those at the sharp end of global injustices, whose rights have been trampled by unaccountable power. The ‘war on terror’ still drives many such injustices.
But last year I decided to shift focus, to stand up to tech giants like Facebook and Google, and to government officials, when they abuse their power through technology. I want to explain why, and why I’m now so keen to talk to more social media content moderators.
Over the past decade, I became increasingly concerned that digital technology was creating new centres of unaccountable power, and new ways for governments and big business to abuse their power. This power required a new organisation to challenge it. I helped found Foxglove to stop these abuses, and to stand up for a future where technology works for all of us—not just an elite.
The rights of tech workers are an important first focus for Foxglove. There’s a paradox at the heart of big tech companies. They enjoy huge power and wealth, which they want us to believe is down to their sophisticated use of computational power. But the slick technology can obscure an important truth. Just like their industrial forebears, the tech titans rely on mountains of human labour. And they have a tendency to exploit and mistreat this workforce.
In the public imagination, Big Tech employs an elite of highly paid coders and visionaries, based in fancy offices with silly names like “The Googleplex” or Facebook’s “Hacker Way.” But that elite is only a small fraction of Big Tech’s real labour force. Without content moderators, our social media feeds would be unusable. Without cooks and drivers, Deliveroo, Just Eat, and UberEats’ appetising interfaces would serve us hot air. And without an army of warehouse packers, Amazon deliveries would be no faster than any old bookstore.
For most of these workers, hours are long, pay is low, terms of employment precarious, and unions absent. The workers are rarely welcome at company HQ. They’re central to tech’s business model, but tech holds them at arm’s length – through spurious claims that they’re self-employed, or by using outsourcing companies. Social media’s factory floor is miles from Silicon Valley, with conditions a world away from those enjoyed by a Mark Zuckerberg or Jeff Bezos.
Foxglove’s first effort to challenge abuses of power in the workplace by Big Tech has been to support content moderators. Content moderators, whose work is still almost invisible, are part of the backbone of the modern internet. Every day they sift through some of the worst content imaginable, so that the rest of us don’t have to. During the pandemic, we’ve depended on content moderators more than ever to keep social media platforms relatively safe. They deal with everything from self-harm, to conspiracy theories, to adverts for ‘coronavirus parties’.
Platforms depend on content moderators, but you wouldn’t know it from the way they treat them. It’s not just the bad hours, toxic content, and low pay. Workers exist in a bubble of total surveillance and are ‘optimised’ within an inch of their lives. Crushing targets are set for how much content you have to get through in a day, along with, in some cases, a fanciful ‘quality’ score of 98%. Taking a ten-minute breather after viewing a particularly disturbing piece of child abuse footage or a terrorist beheading video could get you penalised. The support to help staff cope with harrowing content is woefully inadequate – breathing exercises from “wellness coaches” who are contractually barred from giving real medical care. Unsurprisingly, content moderators can come to real harm, including long-term mental health problems.
The Big Tech platforms wash their hands of those who fall ill. In many cases, they avoid responsibility by “outsourcing” moderation to a separate company. These outsourcing companies then do the bare minimum (or less) to avoid blame when anything goes wrong. In January, Foxglove learned that one of Facebook’s biggest outsourcing companies, Accenture, was asking moderators to sign an “acknowledgment” that the work could give them post-traumatic stress disorder.
In the US, Facebook was recently forced to accept some responsibility for its moderator workforce – agreeing to pay out $52 million to settle a class action case. This settlement helps bust the myth that Facebook has nothing to do with what goes on in its content moderation sites. But Facebook has gone nowhere near far enough to make things right. I explained in this earlier article why $52 million is peanuts, and why the “reforms” to improve working conditions are insufficient.
Foxglove is supporting a separate case brought by Facebook moderators in Ireland. Ireland is Facebook’s European HQ and the ‘nerve centre’ for all moderation policies for Europe, the Middle East, and Africa—a huge chunk of the world. This case, which has been joined by moderators from other European countries, is unaffected by the settlement in the US. In the Irish courts, workers are entitled to individual assessments of their real psychological harm, and one would expect to see higher levels of damage on expert assessment.
But for Foxglove this isn’t just about payouts, or just about moderators in Ireland and the EU, or just about Facebook. We want to improve conditions for moderators whichever platform they work for, wherever they are in the world.
We’re already working with moderators in several European sites and in the US. We’re keen to hear from more, and also to start working with moderators on other platforms, and in other countries. If you’ve got first-hand experience as a moderator, we’d love to hear from you. You could help us better understand the conditions for moderators in your country, on the platform you worked for. We can explore whether there’s any potential for legal challenges which could benefit your own situation and send ripples round the world.
You can read more about how we’re working with content moderators below.