Last year, Facebook researchers attempted to improve the platform's algorithms so that the "worst of the worst" hate speech and racism would be deleted automatically.
But according to internal documents obtained by The Washington Post and reported in a Monday story, company leaders scrapped the idea out of fear of upsetting "conservative partners."
An internal team polled 10,000 Facebook users, showing them examples of hate speech such as a photo of a chimpanzee captioned "Here's one of Michelle Obama," and derogatory remarks about "The Squad," a group of Democratic politicians that includes Reps. Ilhan Omar and Alexandria Ocasio-Cortez. According to the documents, one user labeled the group, which includes two Muslim women, "swami rag heads," while another branded them "black c—-s" in the comments section. The records also show the company made sure to include self-identified White conservatives in the research to head off objections from its leadership, which has a history of making decisions to appease the right.

After two years of running its "Worst of the Worst" project, known internally as Project WoW, the team found that the most vicious hate speech, as ranked by users, is almost always directed at minorities, and that Facebook's algorithms were better at cracking down on comments harmful to White people than on those harmful to people of color. The researchers proposed an automated system that would take down hate speech directed at Black, Jewish, LGBTQ, Muslim, and multiracial people.
Top executives, including Joel Kaplan, Facebook's vice president of global public policy, worried that the approach would be seen as favoring certain vulnerable populations over others. According to a memo prepared for Kaplan, Facebook's right-wing partners would not agree to protecting those groups because, in their view, "hate directed at trans individuals is an expression of opinion."

"The Worst of the Worst project helped show us what kinds of hate speech our technology was and wasn't efficiently detecting and understand what forms of it people think to be the most insidious," company spokesperson Andy Stone told The Washington Post.
Stone also said the company implemented elements of the project, but not all of it, since going further would have resulted in "fewer automated hate speech deletions." The report is yet another example of employees raising concerns with management and proposing solutions, only to be rebuffed for the sake of appearances, profit, or growth.