Rights groups write to Apple over sex abuse scanning tool

Over 90 policy and rights groups across the globe have sent an open letter to Apple Inc., expressing concern over its plans to scan iMessages and iCloud Photos for child sex abuse material. In the letter, the groups urge the tech giant to abandon its plans to monitor the messages and images stored on its users’ phones.

The letter highlights the groups’ concern that these scanning tools could be put to inappropriate use by repressive governments, among other entities, to censor and silence free speech. The bodies argue that while the scanner may have been developed with users’ best interests at heart, it threatens people’s privacy and security, and could have “disastrous consequences,” especially for children.

A Useful Tool, Or A Weapon To Censor?

The news was first reported by Reuters, and details the rights groups’ concerns over the new tool that Apple announced earlier this month. For the unversed, the company has revealed that it is developing a tool that can scan photos and messages on iPhones to detect possible child sex abuse material (CSAM). The announcement has caused an uproar among users, employees, and rights groups alike, with apprehensions that the scanner may be used by governments to keep tabs on their citizens.

Eva Galperin, from digital-rights group Electronic Frontier Foundation, asserts that by building a security back door into its iPhones, Apple will essentially be allowing unauthorized entities to access the data more easily. She adds that the danger is very much real, and requires prompt action.

It is not only rights groups: cybersecurity experts have also expressed worry over the development, as have the heads of tech biggies Epic Games and WhatsApp. Even Edward J. Snowden, the notorious former intelligence contractor charged with leaking government surveillance documents, has called Apple out for creating a tool that could become a weapon in governments’ hands to spy on people and, if necessary, censor their speech.

Image Credits: TechCrunch

In fact, more than 8,000 people have signed the open letter asking Apple to scrap its plans.

Small Victories, But No Solid Win

While Apple has not yet officially changed its stance on the matter, it did say a few days ago that instead of searching for all images that could classify as CSAM, it will focus only on images that have been flagged in multiple countries. The scanner will alert Apple to seek human review only if some 30 photos containing CSAM are detected on a particular device.