A global alliance of child protection advocates and professionals has warned Mark Zuckerberg that Facebook has lost the trust of parents, is prioritizing financial gain over children’s interests, and must take steps to restore trust in its platforms.
In a letter signed by 59 organizations, including the National Society for the Prevention of Cruelty to Children in the United Kingdom and the Child Rescue Coalition in the United States, Facebook’s founder and CEO is urged to publish internal assessments of the risks young people face on its services.
“The company must do significantly better to regain the trust of parents and child protection professionals, and most importantly, to ensure its product decisions contribute to rather than compromise children’s safety and wellbeing,” said the letter.
The letter encourages Zuckerberg to take five steps to address concerns about his company’s approach to protecting minors on its eponymous social media site, its Instagram app, and its WhatsApp messaging service:
- Share all its internal research on the impact its platforms have on children’s wellbeing.
- Set out what research has been conducted on how the company’s services contribute to child sexual abuse.
- Publish risk assessments of how its platforms affect children.
- Provide details of an internal reputational review of its products.
- Review the child protection implications of encrypted messaging.
In testimony to US senators, Facebook whistleblower Frances Haugen accused the firm of a careless approach to safety, and her leaks of internal documents formed the backbone of a series of devastating articles in the Wall Street Journal.
An internal Instagram study detailing the app’s influence on teenage wellbeing was among the most damaging disclosures, with one presentation indicating that 30% of teenage girls felt Instagram made their body dissatisfaction worse.
The group letter echoes Haugen’s accusation that the firm prioritizes profit over people:
“We cannot continue with a situation in which children’s needs are or appear to be secondary to commercial motivations, and in which young people’s right to safety, privacy and wellbeing is traded-off to prioritise the interests of adults and other more influential drivers.”
A spokesperson for Facebook said: “We’re committed to keeping young people who use our platform safe. We’ve spent $13bn [£9.5bn] on safety in recent years – including developing tools to enhance the safety and wellbeing of young people across Facebook and Instagram. We’ve shared more information with researchers and academics than any other platform and we will find ways to allow external researchers more access to our data in a way that respects people’s privacy.”
The letter was issued as hearings resumed on the UK’s proposed online safety legislation, which would require social media sites to safeguard children from harmful material and prevent the spread of illegal content such as child sexual abuse imagery. Laura Edelson, a social media researcher at New York University, told the MPs and peers on the committee that Facebook’s algorithms pushed vulnerable users towards increasingly harmful content because that content was so engaging.
“Attention is what these platforms sell,” said Edelson. “They sell advertising. So this is their business model, to engage users and they have to build algorithms to do that and … certain types of harmful content are just more engaging.”
Another witness, Guillaume Chaslot of the AlgoTransparency project, said social media and video-sharing companies should be penalized based on how widely harmful video content is viewed. Under such a regime, platforms would have an incentive to “react quickly” to risky content and to ensure their algorithms do not promote inappropriate material.