Postponed: crunch ruling in our case against Facebook’s deadly algorithm is now set for April 7
Frustrating news from Nairobi. The ruling we were expecting today in our case challenging Facebook’s deadly algorithm has been postponed until April 7, 2025.
We’ll continue to fight – for as long as it takes. For all the updates on the case as they come in, hit the button below. And for an update on the case as it stands, keep reading.
Together with our legal partners at Nzili & Sumbi, we launched the case in December 2022. It challenges how Facebook fanned the flames of violence in Ethiopia.
The case is being brought by Abrham Meareg, Fisseha Tekle and the Katiba Institute.
Abrham’s father, Professor Meareg Amare Abrha, taught chemistry at Bahir Dar University. He was targeted with death threats and calls for violence in Facebook posts.
When Abrham saw the posts, which included his father’s address, he knew they were a death sentence. He pleaded with Facebook to take them down. They didn’t.
Instead, Meta chose to promote them. Days later, his father was dead. He was murdered by men with machine guns outside his home.
Fisseha is bringing the case alongside Abrham. A former researcher at Amnesty International, he published independent reports on violence by all sides in the Tigray conflict and, like Abrham’s father, was targeted with death threats on Facebook. The third claimant, the Katiba Institute, is a legal organisation that works to defend the Kenyan constitution.
This case challenges Facebook’s algorithm, which promotes hate speech and incitement to violence in users’ feeds. A platform that promotes violence and racial hatred to its users, with the goal of getting as many clicks as possible, is deadly by design.
Instead of engaging with the substance of the legal arguments, Meta has tried to sidestep them. It says the Kenyan courts don’t have the power to hear the case. In legal terms, that’s called disputing the jurisdiction of the court. That’s the crucial issue on which the court will now rule on April 7.
We think Meta’s argument is wrong. At the time of the threats against Abrham’s father, Facebook’s primary safety office for East and Southern Africa was based in Kenya. That’s why we say this case must be heard there.
This is the same content moderation hub at the centre of our case against the unlawful sacking of 185+ content moderators. Our cases against Meta in Kenya are directly linked.
Meta constantly fails to hire enough content moderators, or to provide those it has with safe working conditions. In other words, Facebook chooses not to invest in the key safety workers who might have been able to prevent the violence against Abrham’s father.
It has been a long road to get this far, and thanks to this morning’s postponement, it looks set to be a longer one still. But we’re going to keep fighting.
Foxglove is able to work on difficult, long-term legal cases like this thanks to donations from our supporters. If you’re one of those amazing donors, thank you so much for helping make this happen. And if you aren’t yet, please consider setting up a regular direct debit: