Apple’s new child protection tools have been drawing mixed responses ever since they were announced on Thursday, with some hailing the move as necessary and others claiming that the new policies will help governments keep tabs on user data. WhatsApp CEO Will Cathcart has become the latest addition to the latter camp, saying he finds Apple’s new Child Safety tools “very concerning.”
The CEO also added that WhatsApp does not plan to adopt such a system, even though it and its parent company, Facebook Inc., have recently been called out for allegedly trying to read users’ personal messages in a bid to target ads.
Tools Against CSAM May Become “Government Spyware”
The tools in question were put forward in a bid to slow the spread of child sexual abuse material (CSAM) online, leading Cathcart to acknowledge that even though the intentions may be well-placed, the approach is “very concerning” for the world as a whole. The new system makes it possible for Apple to “scan” users’ private photos, notably those stored in iCloud Photos, to determine whether any of them contain CSAM imagery. Cathcart believes that if the scanner’s code contains any errors, user privacy may be jeopardized.
Previously, other notable names, including the Electronic Frontier Foundation and Epic Games CEO Tim Sweeney, have expressed concerns over the implications of this new photo-scanning tool. Some claim that it could end up becoming a weapon in the hands of governments seeking access to users’ personal photos and other data.
An Explanation Of The Issue
Apple has tried to address some of these concerns in an internal memo, in which the tech giant says it will take it upon itself to explain how the system actually works and to clear up any misunderstandings people may have. Despite the allegations, Apple says its tools do not look through one’s photos, per se. Instead, the scanner compares the mathematical hashes of the files stored on iCloud against hashes of files known to contain CSAM. Moreover, the system is apparently only capable of scanning images stored on iCloud, and it cannot access locally stored files when iCloud syncing is turned off.
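To illustrate the general idea of hash matching, here is a minimal sketch in Python. It is not Apple’s actual implementation (which reportedly relies on perceptual “NeuralHash” values and cryptographic matching rather than plain digests); the KNOWN_BAD_HASHES set and both function names are hypothetical.

```python
import hashlib

# Hypothetical database of hashes for known flagged files.
# In a real system these would be perceptual hashes supplied by
# child-safety organizations, not plain SHA-256 digests.
KNOWN_BAD_HASHES = {
    "placeholder0000000000000000000000000000000000000000000000000000",
}

def file_hash(path: str) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_flagged(path: str) -> bool:
    """Flag a file only if its hash matches a known entry;
    the file's contents are never inspected directly."""
    return file_hash(path) in KNOWN_BAD_HASHES
```

One design detail worth noting: a cryptographic hash like SHA-256 matches only byte-identical files, which is why systems of this kind typically use perceptual hashes that survive resizing and re-encoding. Either way, a match reveals nothing about photos that are not in the database.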
Source: 9to5Mac