More than a dozen cybersecurity experts have reportedly criticized Apple for using "dangerous technology" in its plans to detect images containing child sexual abuse material on iPhones.
Dangerous and Ineffective
The criticism comes in a new 46-page study in which researchers examine Apple's and the European Union's plans to monitor people's phones for illegal content, calling the efforts "dangerous" and "ineffective." They argue that if such technology were deployed, it would give governments a way to expand their surveillance agendas.
The scanning tool, first announced in August, includes on-device scanning of users' iCloud Photos to detect potential Child Sexual Abuse Material (CSAM). It also includes Communication Safety, which warns children and their parents when sexually explicit photos are sent or received, as well as expanded CSAM guidance in Siri and Search.
The researchers say that documents released by the European Union suggest the bloc's governing body is seeking to introduce a similar program, one that would scan encrypted phones not only for child sexual abuse material but also for signs of terror-related imagery and organized crime.
Rising Concerns
In publishing their findings, the team said that resisting attempts to "spy on and influence" law-abiding citizens should be a "national security priority." They note that their study began before Apple's announcement, and that the findings are being published now to show the European Union how dangerous the plan could be.
This isn't the first time Apple has faced backlash over its plans to use the technology. Privacy experts, security researchers, politicians, academics, and even the company's own employees have criticized the decision to include the tool in a future iOS 15 and iPadOS 15 update.
The tech giant initially tried to dispel misunderstandings by reassuring users and addressing their concerns, but later gave in to the criticism and announced it would delay rolling out the features in order to make "improvements" where needed. What those improvements might be remains unclear.
The firm has also staunchly maintained that it will not allow the CSAM scanner to be used by authoritarian governments.