African workers are taking on Meta and the world should pay attention
A landmark case could see the US tech giant forced to take responsibility for workers’ rights violations not just in Kenya but globally.
Mercy Mutemi
Digital civil rights attorney based in Nairobi
Published On 1 Apr 2025

In 2025, the world’s largest social media company, Meta, has taken a defiant new tone on the question of whether and to what extent it accepts responsibility for the real-world harm that its platforms enable.
This has been widely understood as a gambit to curry favour with President Donald Trump’s administration, and Meta CEO and founder Mark Zuckerberg all but said so in a January 7 video announcing the end of third-party fact-checking.
“We are going to work with President Trump to push back on governments around the world that are going after American companies and pushing to censor more,” Zuckerberg said, giving his product decisions a distinct geopolitical flavour.
To justify the company’s decisions to do away with fact-checking and scale back content moderation on its platforms, Zuckerberg and Meta have appealed to the United States’ constitutional protection of the right to freedom of expression. Fortunately, for those of us living in the countries Meta has vowed to “push back on”, we have constitutions, too.
In Kenya, for example, where I represent a group of former Meta content moderators in a class-action lawsuit against the company, the post-independence constitution differs from those of the US and Western Europe in its explicit prioritisation of fundamental human rights and freedoms. The constitutions of a great many nations with colonial histories share this feature, a response to how those rights were violated when their peoples were first pressed into the global economy.
We are now beginning to see how these constitutions can be brought to bear in the global technology industry. In a landmark decision last September, the Kenyan Court of Appeal ruled that content moderators could bring their human rights violations case against Meta in the country’s labour courts.
Few in the West will have grasped the importance of this ruling. Meta, for its part, surely does, which is why it fought the ruling tooth and nail in court and continues to use every diplomatic tool at its disposal to resist the content moderators’ demands for redress. Meta has signalled interest in appealing the decision to the Supreme Court.
Meta and other major US companies maintain a convoluted corporate architecture to avoid exposure to taxes and regulation in the dozens of countries where they do business. They commonly claim not to operate in countries where they count millions of users and employ hundreds to refine their products. Until now, these claims have rarely been challenged in court.
The content moderators’ case, as presented in court, is that they were hired by a business process outsourcing (BPO) company called Sama and put to work exclusively as content moderators for Facebook, Instagram, WhatsApp and Messenger from 2019 to 2023, when much of the moderation of African content on these platforms was performed in Nairobi. Meta disavows these workers and insists they were employed solely by Sama, an issue currently being litigated before the Kenyan courts.
These workers know that Meta’s apparent reversal on content moderation is anything but. As presented in their grievances to the court, the company has never taken the issue seriously. Not seriously enough to stop the civil and ethnic conflicts, political violence and mob attacks against marginalised communities that thrive on its platforms. Not seriously enough to pay fair wages to the people tasked with keeping such content off those platforms. The harm travels both ways: Toxic content inflames real-world horrors, and those horrors generate more toxic content that saturates the platforms.
Content moderators are digital cannon fodder for Meta in a war against harmful content that the company was never really committed to fighting. The case presented by the Nairobi content moderators explains how they accepted jobs they thought would involve call centre and translation work. Instead, they ended up in Meta’s content moderation hub in Nairobi, where they spent their days subjected to an endless torrent of streamed violence and abuse.
Many of them were forced to view atrocities committed in their home countries in order to shield Meta’s users from the harm of seeing those images and footage. They absorbed that trauma so that others in their communities did not have to, and many found this a noble calling.
But the work took its toll on their mental health. More than 140 former content moderators have been diagnosed with PTSD, depression or anxiety arising from their time on the job. A separate case addresses how their efforts to unionise and advocate for better mental healthcare were thwarted. What followed were mass layoffs and the relocation of Facebook content moderation elsewhere.
This left behind hundreds of traumatised people and a trail of human rights violations. Meta argues that it never employed the Facebook content moderators and owed them no responsibility. The litigation is ongoing, and the moderators now rely on the courts to untangle the complexities of their employment relationship.
Even as it fought the case in court, the company sent a delegation in March 2024, led by its then-president of global affairs, Nick Clegg – a former British deputy prime minister – to meet Kenyan President William Ruto and legislators to discuss, among other topics, the company’s vision of partnership with the government in bringing the “generative AI revolution” to the continent. At a town hall event in December, Ruto assured Sama, Meta’s former content moderation partner: “Now we have changed the law, so no one can ever take you to court again on any matter,” referring to a bill passed in Kenya’s parliament that shields Big Tech companies from future cases such as ours.
All this pushback came well before Trump was re-elected, and it appears aimed at evading accountability for the company’s labour practices and the effects of its products. But something remarkable happened, which opens a door for others around the world who labour on behalf of the tech industry but whom the industry itself disavows: The court ruled that our case can proceed to trial.
The fact that the case has advanced despite vigorous legal and political challenges is a testament to the revolutionary nature of post-colonial constitutions, which prioritise human rights above all else.
As our case in Kenya continues, I hope it can inspire tech workers in other post-colonial nations to pursue accountability in the countries where they have been harmed. The right to freedom of expression is an important human right, but we will continue to remind Big Tech that the rights to dignity and freedom from exploitation are equally important.
The views expressed in this article are the author’s own and do not necessarily reflect Al Jazeera’s editorial stance.