Meta Contractor Dismissed Threats to Moderators by Ethiopian Rebels
Recent court documents reveal that a contractor hired by Meta, the parent company of Facebook, dismissed threats made against content moderators by members of the Oromo Liberation Army (OLA), a rebel group in Ethiopia. This information comes to light amid a legal case involving the dismissal of dozens of moderators in Kenya, who allege they were fired for attempting to organize a union.
In 2022, 185 content moderators filed a lawsuit against Meta and two contractors, including Sama, a Kenya-based firm responsible for moderating Facebook content. The moderators claimed they lost their jobs after trying to unionize and were subsequently blacklisted from applying for similar roles at another contractor, Majorel, after Meta changed its content moderation partners.
Moderators focusing on Ethiopia reported being targeted by the OLA for removing the group's videos from the platform. According to court documents filed on December 4 by Foxglove, a British non-profit organization supporting the moderators, Sama initially dismissed the moderators' complaints about the threats, accusing them of fabricating the messages. The firm later agreed to investigate the matter and relocated one moderator, who had been publicly identified by the rebels, to a safehouse.
One moderator recounted receiving a message from the OLA threatening "content moderators who were constantly pulling down their graphic Facebook posts." The message warned them to stop removing the group's content or face severe consequences. When the moderator raised the threat, his supervisor reportedly dismissed his concerns.
Another moderator said he received a message from the OLA listing his and his colleagues' names and addresses, leaving him afraid to visit family members in Ethiopia. The episode underscores the dangerous conditions in which these moderators work, particularly when reviewing content from conflict zones.
The Ethiopian government has accused the OLA of committing violence against civilians in the Oromiya region, where conflict has escalated since peace talks failed in 2023. The court documents also indicate that Meta ignored recommendations from experts it hired to address hate speech in Ethiopia. One expert described being trapped in an "endless loop" of reviewing hateful content that did not violate Meta's policies, which prevented moderators from taking action against it.
Efforts to reach an out-of-court settlement between the moderators and Meta collapsed in October 2022, leaving the case to proceed in court. The outcome of this case could have significant implications for how Meta collaborates with content moderators globally, especially in conflict-affected regions.
The OLA, a splinter group of a previously banned opposition party, has grievances rooted in the perceived marginalization of Ethiopia's Oromo community. In a separate case filed in Kenya in 2022, Meta was accused of allowing violent and hateful posts from Ethiopia to proliferate on Facebook, exacerbating the civil war between the federal government and Tigrayan regional authorities.
As the legal proceedings continue, the situation underscores the challenges faced by content moderators in volatile environments and raises questions about the responsibilities of social media platforms in managing harmful content while ensuring the safety of their workers.