Apple Inc. has faced an onslaught of questions and backlash from employees, users, and app developers alike after announcing plans to release new software that would scan images stored on iPhones to help detect content pertaining to child sexual abuse. Following pressure from multiple parties, Apple has announced a modification to the original plan: it will now search only for child sex abuse images that have already been flagged in multiple countries.
Concerns And Criticism
This comes after the tech giant announced in early August that it was rolling out new software that would scan photos on iPhones, iPads, and Macs to detect content depicting the sexual abuse of children. Following the revelation, critics were quick to point out that the tool could potentially be used by “repressive governments” to keep tabs on people. These claims were backed by the likes of Epic Games CEO Tim Sweeney and WhatsApp CEO Will Cathcart.
On Friday, while announcing the change, Apple also admitted that it had performed poorly in addressing the “jumbled” communication and the misunderstandings surrounding the program.
30 Images Needed To Alert Apple
The firm also informed users that it would start with a threshold of 30 child sex abuse images, which must be detected before Apple is alerted and a human review is triggered. This number could go down in the future as the program matures. Images will be checked against those known by the National Center for Missing and Exploited Children to contain CSAM. Only images uploaded to iCloud will be scanned; pictures stored in local memory will be spared.
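The threshold mechanism described above can be sketched roughly as follows. This is an illustrative simplification only: the hash function, database, and review logic here are placeholders, not Apple's actual NeuralHash or private-set-intersection protocol.

```python
import hashlib

# Hypothetical database of known-CSAM image hashes (placeholder values).
KNOWN_HASHES = set()

# Apple's announced starting limit before a human review is triggered.
MATCH_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Stand-in hash; a real system would use a perceptual hash, not SHA-256."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(uploaded_images, known_hashes):
    """Count how many uploaded images match the known-hash database."""
    return sum(1 for img in uploaded_images if image_hash(img) in known_hashes)


def should_trigger_review(uploaded_images, known_hashes, threshold=MATCH_THRESHOLD):
    """A human review fires only once the match count reaches the threshold."""
    return count_matches(uploaded_images, known_hashes) >= threshold
```

The key design point the article reports is the threshold itself: no single match alerts Apple; only an accumulation of at least 30 matches does.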
The Center is, so far, the only clearinghouse that has agreed to partner with Apple on the plan, and the firm has said it will launch the program only in the United States at first. Even then, only images that have been flagged in multiple countries will be matched.
Riana Pfefferkorn, an encryption and surveillance researcher at Stanford, believes the change came about due to the efforts of critics, something Apple doesn’t seem too keen to admit. This may be seen as a win for the many people who took issue with the new scanner, fearing it could give governments a weapon to spy on their citizens by accessing their private data.
Source: The Sun