With Apple planning to roll out a new tool that scans images for child sexual abuse material (CSAM), some users may feel they now have a reason to quit their iPhones. An op-ed in The Washington Post by two researchers, both of whom say they developed a CSAM-scanning tool akin to the one being rolled out by Apple, warns against using the technology, asserting that it is “dangerous.”
A Cause For Concern
For the unversed, Apple Inc. plans to introduce the detection tool on its iPhones, iPads, and Macs starting next year, and the news has left many people, from activists to customers to employees to CEOs of other companies, concerned. As per reports, the tool will allow Apple to scan images stored in users’ Photos libraries and sent through iMessage, in order to detect possible CSAM. While the intentions behind such a scanner may be well placed, experts say that it could very easily be turned into a back door for governments to spy on their citizens.
Amid all the drama, the op-ed in The Washington Post seems like the final nail in the coffin. The two researchers say they are behind the “only peer-reviewed article” on how to build a technology similar to Apple’s, and they have concluded that it can be dangerous. Princeton researchers Jonathan Mayer and Anunay Kulshrestha believe that the system they developed could very easily be directed toward “surveillance and censorship.” They add that the design is not limited to one specific type of content; instead, any content-matching database can be swapped in for the one fed to the system.
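Their point about swappable databases is easy to illustrate. Below is a minimal sketch in Python of a hash-based matcher; the names are hypothetical, and the cryptographic hash is merely a placeholder standing in for a perceptual hash like Apple’s NeuralHash, which matches visually similar images and can therefore produce false positives.

    import hashlib

    def image_hash(image_bytes: bytes) -> str:
        # Placeholder only: a real system would use a perceptual hash
        # (e.g., NeuralHash), which maps visually similar images to the
        # same digest and so can collide on unrelated images, the
        # false-positive risk the researchers flag.
        return hashlib.sha256(image_bytes).hexdigest()

    def scan_library(images: list[bytes], blocklist: set[str]) -> list[int]:
        # Flag every image whose hash appears in the blocklist.
        # Nothing here constrains what the blocklist contains: swap a
        # CSAM hash database for one of, say, pro-democracy imagery,
        # and the very same code flags political content instead.
        return [i for i, img in enumerate(images)
                if image_hash(img) in blocklist]

In other words, the database is just an input; the scanner itself has no notion of what kind of content it is hunting, which is precisely why the researchers warn the design generalizes to surveillance and censorship.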
Fueling Further Concerns
As such, their claims seem to be doing nothing but further fueling people’s worries about the initiative. The researchers also pose direct questions to the company, demanding to know what would stop China, which happens to be Apple’s second-largest market, from repurposing the technology to scan people’s Apple devices for pro-democracy materials.
And these concerns are not entirely baseless, since just a few months ago, Apple was accused of giving in to China’s demands and storing Chinese users’ data in a local data center operated by a state-owned company.
And as if that weren’t enough, Mayer and Kulshrestha further note that the content-matching system could return false positives, or could be tampered with by malicious actors in order to implicate innocent people.
This comes even after more than 90 civil society groups, as well as the firm’s own employees, wrote to Apple asking it to reconsider its decision to roll out the tool.