Apple is set to release a technology that will scan users' files to spot child sexual abuse material (CSAM) and report cases to the relevant authorities. Although the company has been making efforts to reassure users about privacy and safety, a good majority aren't convinced by its claims. The technology, which goes by the name NeuralHash, will first launch in the US next month.
What is NeuralHash?
In a society where technology has taken the forefront, it becomes easier to engage in exploitative activities. Alan Turing did society a great favor with his discovery, but it is not free of flaws. It is therefore crucial that steps are taken to ensure safety and prevent exploitation. The latest technology set to launch is aimed at protecting children from online harm and exploitation. It will provide filters that create a barrier against sexually explicit photos. Another feature will also warn users when they search, through Siri and Search, for terms related to CSAM.
It is well known that a good number of cloud services already scan user files for illegal content or content that violates their terms of service. Apple, however, had long refrained from scanning users' files, instead giving users the option to encrypt their data. That stance appears to have come to an end with NeuralHash.
The news about NeuralHash surfaced on Wednesday, thanks to a series of tweets by Matthew Green, a cryptography professor at Johns Hopkins University. (Source: TechCrunch)
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
— Matthew Green (@matthew_d_green) August 4, 2021
These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.
— Matthew Green (@matthew_d_green) August 5, 2021
Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.
— Matthew Green (@matthew_d_green) August 5, 2021
The ability to add scanning systems like this to E2E messaging systems has been a major “ask” by law enforcement the world over. Here’s an open letter signed by former AG William Barr and other western governments. https://t.co/mKdAlaDSts
— Matthew Green (@matthew_d_green) August 5, 2021
The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t “hurt” anyone’s privacy.
But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal.
— Matthew Green (@matthew_d_green) August 5, 2021
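Taken together, the tweets describe approximate matching of on-device perceptual hashes against a database of known-CSAM hashes, with reporting only after enough matches accumulate. The Python sketch below is a minimal illustration of that general mechanic, not Apple's system: the hash values, the Hamming-distance cutoff, and the reporting threshold are all invented for the example, and Apple's actual NeuralHash algorithm and server-side protocol have not been published in this detail.

# Illustrative sketch of threshold-based perceptual-hash matching.
# NeuralHash is a neural-network-based hash and Apple reportedly pairs
# it with a cryptographic matching protocol; none of that is reproduced
# here. Every value below is hypothetical.

KNOWN_HASH_DB = {          # hypothetical database of known-image hashes
    0b1011_0110_1100_0011,
    0b0110_1001_0011_1100,
}

MATCH_DISTANCE = 2         # max differing bits to count as a match (assumed)
REPORT_THRESHOLD = 3       # matches required before flagging (assumed)

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two same-width hashes."""
    return bin(a ^ b).count("1")

def matches_known(image_hash: int) -> bool:
    """A perceptual hash 'matches' if it is near any database entry."""
    return any(hamming_distance(image_hash, h) <= MATCH_DISTANCE
               for h in KNOWN_HASH_DB)

def scan_library(image_hashes: list[int]) -> bool:
    """Client-side pass: flag the library only past a match threshold."""
    matches = sum(1 for h in image_hashes if matches_known(h))
    return matches >= REPORT_THRESHOLD

if __name__ == "__main__":
    library = [
        0b1011_0110_1100_0001,  # 1 bit from a DB entry -> match
        0b0110_1001_0011_1110,  # 1 bit from a DB entry -> match
        0b1011_0110_1100_0000,  # 2 bits from a DB entry -> match
        0b0000_0000_0000_0000,  # far from both entries -> no match
    ]
    print(scan_library(library))  # True: 3 matches reach the threshold

The threshold is what Green's "report them to Apple servers if too many appear" refers to: a single near-match is treated as noise, and only an accumulation of matches triggers any report.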
Ever since the news broke, many users have been expressing concerns about privacy and safety. That is not a surprising response, given the society we live in, where a single crack can give way to massive impacts. Although the technology offers a certain level of promise when it comes to protecting child safety, one cannot help but wonder what would happen if this scanning capability fell into the hands of an authoritarian government. There is every possibility that what was created to prevent exploitation would end up becoming a tool of large-scale exploitation.
People are primarily uncomfortable with the idea of looming surveillance by an algorithm, which sounds rather disruptive, and it is quite reasonable that experts are asking for a more public discussion before the feature launches. Another haunting question is the timing of the technology. Why has the company suddenly decided to introduce such surveillance? Many users cannot help but wonder whether the company is giving in to coercion from the US government; reports of the government pressuring the company to weaken its encryption have circulated for a while. Although the ultimate goal, facilitating the investigation of serious crimes, is a noble one, concerns persist about the many ways in which this could be misused. Here are a few reactions and responses from Twitter following the news.
I tried to sort through my feelings about Apple’s child safety announcements today. The risk of the slippery slope is real — but there are also real harms taking place on iCloud today, and it’s good that Apple has chosen to pay attention to them https://t.co/0N0qTlWn1I pic.twitter.com/puF0sMVL3a
— Casey Newton (@CaseyNewton) August 6, 2021
Perhaps the phrase to use here is, "desperate times call for desperate measures."
#iCloud new safety policy pic.twitter.com/2CiaXiahjf
— ria (@yellocato) August 6, 2021
Let us hope that the good intentions aren’t obliterated by sinister motives in the long run.
At face value, Apple's ability to detect known Child Sexual Abuse Material images stored in iCloud Photos using a privacy-preserving hash system seems unconcerning and I trust Apple has good intentions.
— Joe Rossignol (@rsgnl) August 5, 2021
However, not everyone is convinced about the intentions.
Apple plans to modify iPhones to constantly scan for contraband:
“It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops,” said Ross Anderson, professor of security engineering.#Apple #iphone #iCloud #privacy pic.twitter.com/spQnIXQ5mk
— S4RK (@S4RKY) August 6, 2021
https://twitter.com/icanzilb/status/1423524622947991555?s=20
Benny, how is that consistent with Apple's claim that it has no access to innocent images? If all images are uploaded in the clear to iCloud for scanning, Apple has access to them all, illegal and legal alike https://t.co/ApzNTKYLTT
— Ross Anderson (@rossjanderson) August 6, 2021
There has not yet, as far as I can tell, been enough technical detail released yet to fully analyze the Apple CSAM scheme. In particular, it’s unclear what it would take to deliberately or maliciously obtain decryption keys for images that aren’t actually in the official DB.
— matt blaze (@mattblaze) August 6, 2021
https://twitter.com/zackwhittaker/status/1423383256687419393?s=20