In a recent report by The Guardian, Kenyan content moderators who worked on OpenAI’s ChatGPT shared their distressing experiences. These moderators were employed by Sama, a California-based data-annotation company that held a contract with OpenAI and also provided data-labeling services to tech giants like Google and Microsoft. Sama ended its collaboration with OpenAI in February 2022, however, over concerns about handling potentially illegal content used for AI training.
One of the moderators, Mophat Okinyi, spoke out about the challenges he faced while reviewing content for OpenAI. He revealed that he had to read up to 700 text passages every day, a staggering workload. Many of these passages revolved around explicit and graphic themes, particularly sexual violence. This constant exposure to disturbing content took a toll on Mophat’s mental well-being, leaving him feeling paranoid and anxious about those around him.
The distressing nature of the work also deeply affected his personal life. Mophat disclosed that his relationship with his family suffered as a result of the emotional burden he carried from his job. The graphic and disturbing content he encountered on a daily basis haunted him even outside of working hours, making it challenging for him to find solace and peace at home.
Addressing the Human Cost: Prioritizing Moderator Well-being in AI Content Moderation
The story of Mophat and other Kenyan moderators highlights the importance of ensuring proper support and care for content reviewers who are exposed to such distressing material. The impact of this work on their mental health should not be underestimated, and it is crucial for companies like OpenAI to provide comprehensive assistance and resources to help these moderators cope with the emotional toll of their responsibilities.
The report sheds light on the potential risks and ethical concerns involved in training AI models using sensitive and explicit content. As technology continues to advance, it is essential for companies to consider the well-being of their workforce and take appropriate measures to protect their mental health.
The revelations made by these moderators have raised important questions about the content moderation industry’s practices and the responsibility of companies like OpenAI and Sama in safeguarding their workers’ mental and emotional well-being. It serves as a reminder that while AI development is vital for progress, the human toll behind the scenes should not be overlooked.
Alex Kairu, another former moderator, told the news outlet that the experiences he encountered on the job “destroyed me completely.” He said the role made him more introverted and strained his physical relationship with his wife. Representatives from OpenAI and Sama did not immediately respond to requests for comment, and OpenAI did not provide a statement to The Guardian.
Working Conditions and Compensation Controversy
According to The Guardian, moderators raised concerns about disturbing content they had to review, including violence, child abuse, bestiality, murder, and sexual abuse. They claimed that during the contract between OpenAI and Sama, they were paid meager wages ranging from $1.46 to $3.74 per hour. Additionally, Time previously reported that data labelers were paid less than $2 an hour to review content for OpenAI.
Now, four of the moderators are urging the Kenyan government to investigate the working conditions during the contract period, saying they did not receive sufficient support for their work. Sama disputed this, telling The Guardian that workers had round-the-clock access to therapists and received other medical benefits.
“We are in agreement with those who call for fair and just employment, as it aligns with our mission,” a Sama spokesperson told the news outlet, adding, “We believe that we would already be compliant with any legislation or requirements that may be enacted in this space.”