Meta Sued For $2B Over Violent Killing In Ethiopia

Facts

  • A group of researchers and activists has filed a lawsuit against Facebook's parent company, Meta Platforms Inc., in Kenya's High Court. The group accuses the company of amplifying hate speech on the platform and inciting violence in Ethiopia, and is seeking about $2B in restitution.
  • The lawsuit contends that Facebook's recommendation systems amplified violent posts in Ethiopia — which has been in a state of civil war since Nov. 2020 — including several that preceded the murder of Meareg Amare Abrha, the father of one of the petitioners who brought the case.
  • Abrham Meareg, the son of Tigrayan academic Meareg Amare Abrha, said his father was targeted by a series of threatening posts because of his Tigrayan ethnicity. The posts, which contained his father's address and called for his death, were reported to Facebook.
  • Meareg's father was eventually murdered on Nov. 3, 2021, when a group of men followed him from his university on motorbikes and shot him twice in front of his home. The violent posts, which appeared on a page with 50K followers, weren't taken down by Facebook until long after his death.
  • Meta has responded to the suit, saying, "we employ staff with local knowledge and expertise and continue to develop our capabilities to catch violating content in the most widely spoken languages in the country, including Amharic, Oromo, Somali, and Tigrinya."
  • The case comes as Meta faces increasing criticism over its content moderation in countries afflicted by conflict and atrocities, such as Myanmar, Sri Lanka, Indonesia, and Cambodia. The company has acknowledged that it was "too slow" to act in Myanmar and other countries.

Sources: Bloomberg, Al Jazeera, DW, NBC, Washington Post, and Reuters.

Narratives

  • Narrative A, as provided by BBC News. Meta, yet again, has blood on its hands — its lack of content moderation or sensible policy has led to the death of Meareg Amare Abrha and possibly many others. This is what happens when large Western companies fail to even attempt to understand the complexities of social conflict outside of their regions. The level of negligence Meta has shown in this case should be considered criminal, as its inability to stop the spread of violent rhetoric has directly led to increased violence and death.
  • Narrative B, as provided by Wired. Though Meta certainly has work to do regarding content moderation, moderating speech at this scale is a new frontier in human history, and refining these systems will take time. Facebook flagged Ethiopia as being at "dire" risk of violence in 2021 and directed its most effective resources toward fighting hate speech and violent rhetoric. Unfortunately, Ethiopian users didn't engage with hate speech in the same way users from other regions did, so Meta had to adjust its approach on the fly.