The top consulting firm Accenture is questioning its lucrative contract with Facebook, under which it is paid to scrub the site of pornography, violence, suicides, and other toxic material.
Thousands of staff and contractors spend eight hours a day reviewing hundreds of videos, photographs, and posts for the social media giant to keep them from spreading online, according to a New York Times piece published on Aug. 31.
Employees have become depressed, anxious, and paranoid, and one filed a class-action lawsuit over the working conditions, which included viewing rapes, animal cruelty, dead bodies, and other gruesome footage from the Syrian conflict.
Top officials have raised concerns about whether Accenture should continue the job, but the issue remains unresolved, according to the New York Times, and the lucrative contract continues.
The initiative is part of Facebook CEO Mark Zuckerberg’s vow to clean up the company following widespread criticism. Facebook uses artificial intelligence to delete around 90% of offending posts and outsources the rest of the work to at least ten consulting firms.
According to the New York Times, one of these companies, TaskUs, now derives a third of its revenue, $150 million per year, from Facebook.
Accenture was previously a division of Arthur Andersen, which was forced to separate its consulting division in 2000. In 2001, Andersen Consulting, as it was then known, rebranded as Accenture.
According to a Times investigation, the company mostly provides accounting and technology advice, but it also earns $500 million every year from Facebook and employs one-third of the people who moderate its content.
Accenture obtained an accounting contract with Facebook in 2010, and two years later, the agreement was expanded to include content moderation.
“Their contracts, which have not previously been reported, have redefined the traditional boundaries of an outsourcing relationship,” write Times reporters Adam Satariano and Mike Isaac, who interviewed more than 40 current and former Accenture and Facebook employees, labor lawyers and others. “Accenture has absorbed the worst facets of moderating content and made Facebook’s content issues its own. As a cost of doing business, it has dealt with workers’ mental health issues from reviewing the posts. It has grappled with labor activism when those workers pushed for more pay and benefits. And it has silently borne public scrutiny when they have spoken out against the work.”
Meanwhile, top executives have joined workers in challenging the contract’s ethics, with former CEO Pierre Nanterme (who died in 2019) and current CEO Julie Sweet ordering a review and dispatching observers to monitor managers and staff at work.
Sweet made a few changes, including adding a lengthy legal disclaimer warning that the work has the potential to harm workers’ emotional or mental health.
In its annual report from last year, Accenture identified content filtering as a business risk.
Executives from Facebook and Accenture declined requests for comment, though an Accenture representative told the New York Times that the effort was “essential to protecting our society by keeping the internet safe.”
Accenture now moderates content for YouTube, Twitter, Pinterest, and other social media platforms.