
Nearly 5,000 groups and individuals have signed an open letter urging Apple to reconsider its plan to scan photos for child sexual abuse material.
The letter said, “While child abuse is a serious problem, and efforts to combat it are almost undeniably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.”
Apple recently announced expanded child-safety protections for iOS and macOS. Once deployed, the technology will monitor photo storage and sharing on these devices: users in Family Sharing accounts who receive sexually explicit images via the Messages app will be alerted, and Apple will also be able to scan photos uploaded to iCloud for Child Sexual Abuse Material (CSAM).
Apple said, “Before an image is stored in iCloud Photos, an on-device matching process for that image is performed against a known CSAM hash.”
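To make the quoted mechanism concrete, here is a deliberately simplified sketch of what "matching an image against known hashes on the device" means in general. This is not Apple's implementation: the real system uses NeuralHash, a perceptual hash, together with a private set intersection protocol, so neither plain SHA-256 digests nor a readable blocklist like the one below are involved. All names and data here are illustrative placeholders.

```python
import hashlib

# Hypothetical blocklist of known-image digests (hex strings). Apple's real
# system ships NeuralHash perceptual hashes in blinded form, not plain
# SHA-256 digests like these.
KNOWN_HASHES = {
    hashlib.sha256(b"placeholder known image bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if this image's digest appears in the local blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

if __name__ == "__main__":
    # Simulate the pre-upload check with two in-memory "images".
    print(matches_known_hash(b"placeholder known image bytes"))  # True
    print(matches_known_hash(b"some other photo bytes"))         # False
```

The main difference from this sketch is the hashing itself: a cryptographic digest such as SHA-256 only matches byte-identical files, whereas a perceptual hash like NeuralHash is designed to match images that have been resized, cropped slightly, or re-encoded.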
As a result, organisations and individuals have begun voicing concerns about the new system’s potential impact on the privacy of everyday users. An open letter collecting signatories and their statements was published on GitHub.
As the Electronic Frontier Foundation pointed out in a blog post published on Thursday, the update could be used to snoop on Apple users. Excerpts from that post were reproduced on the website appleprivacyletter.com.
The letter was published on Friday, and by Sunday morning approximately 5,000 individuals and organisations had signed it. Co-signers include members of the Freedom of the Press Foundation, where NSA whistleblower Edward Snowden serves as chairman of the board.
“No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this,” Snowden said on Twitter. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.*”
Some of those who signed the letter identified themselves as current or former Apple employees.
On Friday, WhatsApp chief executive Will Cathcart said the company is concerned by Apple’s decision to scan photos. In a Twitter thread, he wrote, “I think this is the wrong approach and a setback for people’s privacy all over the world.”
Tim Sweeney, CEO of Epic Games, said he had “tried hard” to see the move from Apple’s point of view. “But inescapably, this is government spyware installed by Apple based on a presumption of guilt,” he added.
Sweeney’s company has previously clashed with Apple in a high-profile legal battle over App Store rules.