Kenya: Moderators Sue Meta for Alleged Psychological Distress
Facts
- Dozens of Facebook content moderators in Kenya are suing Meta and outsourcing company Samasource Kenya over allegations of human trafficking, modern slavery, unfair labor practices, and intentional infliction of mental harm.[1][2]
- More than 140 of the moderators have been diagnosed with severe post-traumatic stress disorder after reviewing content that allegedly included murders, suicides, and child abuse between 2019 and 2023, according to filings submitted as part of the suit.[2][3]
- The moderators, who worked 8- to 10-hour shifts at the Samasource facility in Nairobi, were allegedly paid $1.50 per hour, roughly one-eighth of what their US counterparts earn, while screening content in multiple African languages.[3][4]
- Medical reports filed with the Kenyan employment court revealed that all 144 examined moderators developed PTSD, generalized anxiety disorder, and major depressive disorder, with 81% showing severe symptoms.[3]
- At least 40 moderators reported substance abuse issues, while others said they experienced marriage breakdowns and social isolation due to their work.[3]
- This comes after a Kenyan court ruled that Meta could be sued in the country, rejecting the company's argument that it has no registered office there. On Monday, Kenya's President William Ruto said he would sign a bill into law to prevent outsourcing companies from being sued in Kenya in the future.[4][5]
Sources: [1] The Guardian (a), [2] Newscentral Africa, [3] The Guardian (b), [4] Time, and [5] Washington Post.
Narratives
- Narrative A, as provided by Time and Africa News. The content moderation system exploits vulnerable workers in developing countries, subjecting them to severe psychological trauma while paying them a fraction of what US moderators earn, all while denying them adequate mental health support and attempting to evade responsibility through outsourcing. All Big Tech companies should wake up and address the human rights violations taking place along their value chains.
- Narrative B, as provided by Meta and TechCrunch. Meta takes the support of content reviewers seriously. It provides counseling, training, and round-the-clock support through third-party contracts, requires those contractors to pay above local industry standards, and implements technical measures like blurring and muting to limit exposure to graphic content. Though it recognizes this is challenging work for data annotators, Meta has always followed Kenyan law and upheld its ethical and wellness standards.