
Apple earlier said that it would use its neuralMatch software to analyse images uploaded to iCloud by American users and would contact the National Center for Missing and Exploited Children (NCMEC) if child abuse material were identified.
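Apple has not published neuralMatch's internals, so the following Python sketch only illustrates the general technique such systems are commonly built on: reducing each image to a compact perceptual fingerprint and comparing it against a database of fingerprints of known material. It uses the open-source imagehash library; the hash value, distance threshold, and file name are illustrative assumptions, not Apple's actual parameters.

```python
# Hypothetical sketch of perceptual-hash matching. NOT Apple's actual
# neuralMatch implementation, which is proprietary; this only shows the
# general pattern of comparing image fingerprints against a database of
# known fingerprints rather than inspecting content directly.
from PIL import Image
import imagehash

# Placeholder fingerprint database, standing in for hashes of known
# abusive material as supplied by a clearinghouse such as NCMEC.
KNOWN_HASHES = {
    imagehash.hex_to_hash("d1d1d1d1d1d1d1d1"),  # illustrative value only
}

MAX_DISTANCE = 5  # Hamming-distance threshold for a "match" (illustrative)

def matches_known_material(path: str) -> bool:
    """Return True if the image's perceptual hash is close to a known hash."""
    fingerprint = imagehash.phash(Image.open(path))
    # ImageHash subtraction yields the Hamming distance between fingerprints.
    return any(fingerprint - known <= MAX_DISTANCE for known in KNOWN_HASHES)

if __name__ == "__main__":
    if matches_known_material("upload.jpg"):
        print("Flag for human review and possible NCMEC report")
```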
Edward Snowden has joined the campaign against the move, signing a petition opposing the company’s plan to scan all iPhone users’ photographs. The whistleblower, who fled the United States after leaking highly sensitive details of global surveillance programs, backed the open letter published on GitHub and voiced his concerns about Apple’s “privacy-invasive content scanning technology.”
If you have a @github account, you can join me in co-signing the first letter uniting security & privacy experts, researchers, professors, policy advocates, and consumers against @Apple's planned moves against all of our privacy. https://t.co/QIb1TwJE0C
— Edward Snowden (@Snowden) August 6, 2021
In a series of tweets, he expressed alarm that Apple is rolling out “mass surveillance to the entire world” and setting a precedent that could allow the company to scan for any other arbitrary content in the future.
Snowden observed that Apple has historically been an industry leader on digital privacy: when the FBI and a federal judge ordered the company to unlock an iPhone used by Syed Farook, one of the gunmen in the December 2015 attack in San Bernardino, California, Apple fought the order, claiming it would create a “dangerous precedent.”
No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.
They turned a trillion dollars of devices into iNarcs—*without asking.* https://t.co/wIMWijIjJk
— Edward Snowden (@Snowden) August 6, 2021
The Electronic Frontier Foundation (EFF), a prominent international digital rights group, has slammed Apple’s plan to scan its customers’ photo libraries and messages, calling it “shocking” to consumers who have depended on Apple’s privacy and security leadership.
“Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor…
It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.
All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”
Apple’s scanning of communications and iCloud Photos could readily be expanded, and the company could be legally compelled to include additional material. As the EFF noted, “Make no mistake: this is not a benefit for all iCloud Photos users, but a reduction in privacy.”