Nathan Nkunzimana, filled with sorrow, reflected on the distressing memories of his time as a content moderator for a Facebook contractor. For eight hours a day he watched alarming videos, including child molestation and a woman’s murder. The weight of such horrors took its toll on Nkunzimana and his colleagues, some of whom broke down in screams and tears.
Today, Nkunzimana stands alongside nearly 200 former employees in Kenya who are taking legal action against Facebook and its local contractor, Sama. Their lawsuit seeks to shed light on the working conditions endured by content moderators, and the case holds potential implications for people in similar roles worldwide. It is the first court challenge of its kind outside the United States, where Facebook reached a settlement with moderators in 2020.
The individuals were employed at a content moderation hub in Nairobi, Kenya, an outsourced facility for Facebook. Their role involved screening user-generated content, such as posts, videos, and messages from across Africa, to ensure compliance with Facebook’s community standards and terms of service. They were responsible for identifying and removing any content that violated the platform’s guidelines or contained illegal or harmful material.
These moderators, hailing from various African countries, are now pursuing a compensation fund of $1.6 billion. They claim their working conditions were substandard, citing inadequate mental health support and low wages as significant concerns. Earlier this year, they were laid off by Sama, the contractor responsible for content moderation, as it exited that line of business. The moderators argue that Facebook and Sama have disregarded a court order to extend their contracts until the case is resolved. Facebook and Sama have defended their employment practices in response to these allegations.
Impact of Facebook on Moderators’ Mental Health
The moderators find themselves grappling with uncertainty over how long the legal proceedings will last. As their financial resources dwindle and their work permits expire, the traumatic images continue to haunt them, deepening their despair. “If you feel comfortable browsing and going through the Facebook page, it is because there’s someone like me who has been there on that screen, checking, ‘Is this okay to be here?’” said Nkunzimana, a father of three from Burundi, during an interview with The Associated Press in Nairobi.
According to Nkunzimana, content moderation is akin to being a “soldier” who takes a bullet for Facebook users. As content moderators, they were exposed to distressing material depicting violence, suicide, and sexual assault, diligently ensuring its removal from the platform.

Initially, Nkunzimana and his colleagues took pride in their work, considering themselves “heroes to the community.” However, the constant exposure to such alarming content had a profound impact, particularly on those who had previously fled political or ethnic violence in their home countries. Despite their struggles, the moderators received little support and worked in a culture of secrecy.
They were required to sign nondisclosure agreements and were not permitted to bring personal items like phones to work. At the end of his shift, Nkunzimana would return home exhausted, often seeking solace by isolating himself in his bedroom, attempting to forget the disturbing images he had witnessed. Even his wife remained unaware of the true nature of his job.
Inadequate Support and Low Wages: Impact on Moderators
Nowadays, he confines himself to his room to avoid questions from his children about why he is no longer employed and why they may no longer be able to afford school fees. Content moderators were paid $429 per month, with non-Kenyans receiving a small expat allowance on top of that.
According to Nkunzimana, Sama, the U.S.-based Facebook contractor, made little effort to ensure that adequate professional counseling was available to moderators in its Nairobi office. He claimed the counselors were poorly trained to handle the psychological toll borne by his colleagues. In the absence of proper mental health care, he now seeks solace in his church.
Meta, the parent company of Facebook, has stated that its contractors are contractually obligated to pay their employees above the industry standard in their respective markets and provide on-site support through trained professionals. However, a spokesman for Meta declined to comment on the specific case in Kenya.
In an email to The Associated Press, Sama stated that the salaries it offered employees in Kenya were four times the local minimum wage. The company also noted that “over 60% of male employees and over 70% of female employees were living below the international poverty line (less than $1.90 a day)” before joining the organization.

Exploitative Content Moderation Practices in Kenya and Involvement of Facebook
Sama also stated that all employees had unlimited access to one-on-one counseling without fear of repercussions. The company called a recent court decision to extend the moderators’ contracts “confusing,” citing a subsequent ruling that paused its implementation.
Content moderation work can take a severe psychological toll, yet people in lower-income countries may be willing to accept the risk in exchange for an office job in the tech industry, according to Sarah Roberts, an expert in content moderation at the University of California, Los Angeles. She argued that outsourcing such sensitive work to countries like Kenya is rooted in exploitative practices: it takes advantage of global economic disparities while allowing companies to evade responsibility through third-party arrangements.
Roberts also expressed concerns about the quality of mental health care and raised issues regarding therapy confidentiality. She noted that the Kenya court case stands out because the moderators organized and pushed back against their working conditions, thereby gaining unusual visibility. In the United States, settlements are the common resolution tactic in such cases, but this may not be as straightforward if similar cases emerge in other jurisdictions.
Following accusations of allowing hate speech to circulate in countries like Ethiopia and Myanmar, Facebook established moderation hubs worldwide and recruited content moderators fluent in various African languages. However, moderators hired by Sama in Kenya were routinely exposed to graphic content closely tied to their own experiences, such as the war in Ethiopia’s Tigray region. For Fasica Gebrekidan, who worked as a moderator for two years, the content she reviewed often depicted the brutalities of that conflict, including rape. Moderators had to watch specific portions of videos to determine whether they should be removed.
The Impact of Inadequate Support: Psychological Trauma and Uncertainty
The gratitude Fasica initially felt upon securing the content moderation job quickly vanished. The work became a form of torture, constantly exposing her to the horrors she had fled. Now she finds herself without income or a permanent home, yearning to regain a sense of normalcy. A former journalist, she can no longer bring herself to write as an emotional outlet.
Fasica fears the traumatic experiences will linger in her mind forever. During the conversation, her gaze fixated on a painting across the cafe, a deep red piece depicting a distressed figure, which unsettled her. She holds Facebook responsible for the lack of adequate mental health care and fair compensation, and blames the local contractor for exploiting her and ultimately letting her go. Fasica believes Facebook should be aware of the situation and genuinely care about the well-being of content moderators like her.
The moderators’ complaint now rests with the Kenyan court, and the next hearing is scheduled for July 10. The ongoing uncertainty frustrates Fasica; some of her colleagues have already given up and returned to their home countries, though that is not currently an option for her.