
Apple drops plans for controversial child abuse scanning (CSAM)

iPhone Photos Scanning


Apple, one of the most popular technology giants, constantly works on small refinements that carry huge significance within its ecosystem, as well as polished features that are hard to beat in terms of performance. The company often launches features later than Samsung or other Android makers, but it then adds value by syncing them seamlessly with the Apple ecosystem, leaving users almost addicted to what Apple has to offer.

Even so, some of Apple’s features land in “controversy” territory, and beyond that point there is absolute silence from the company. One such feature was the scan for Child Sexual Abuse Material (CSAM) in photos on iPhones and iCloud, meant to flag any images that could fall into the category of child abuse. The intention behind it may have been good, but the potential negative impact dragged Apple into considerable controversy. So much so that Apple went completely silent, until today, when it finally removed all mentions of the feature from its website.

Yes, according to a report by MacRumors, Apple has removed all mentions of the controversial scanning feature from its child safety website. Visit the page now and you will only see the optional nude-photo detection in Messages and the pop-up intervention shown when people search for child abuse or child exploitation references.

However, does this mean that Apple is done with the feature altogether? Well, it is Apple, and we think it may be refining its policies and the scope of the feature to minimize the potential privacy exploitation that threatens users. As mentioned in a report by Engadget, the system was designed to search iCloud Photos backups for hashes of known CSAM. Had the feature been enabled, and had the system found matching hashes in a local photo library with iCloud Photos turned on, Apple would decrypt the suspected safety vouchers, which are apparently included with every image, manually review the image, and report it to the National Center for Missing and Exploited Children, which would get law enforcement involved and could end up saving a child.
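The matching step described above can be sketched very roughly. The snippet below is a minimal illustration only: Apple’s actual system reportedly used a perceptual hash (NeuralHash) with threshold secret sharing, so a plain cryptographic hash lookup is merely a stand-in for the idea of comparing an image’s fingerprint against a database of known material. All names and values here are hypothetical.

```python
import hashlib

# Hypothetical database of known-bad image hashes (illustrative values only;
# the real system stored blinded perceptual hashes, not SHA-256 digests).
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-flagged-image").hexdigest(),
}

def matches_database(image_bytes: bytes) -> bool:
    """Return True if this image's fingerprint appears in the hash database.

    A real deployment would compute a perceptual hash that tolerates
    resizing and recompression; an exact digest match is a simplification.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES
```

An ordinary photo produces no match, so `matches_database(b"holiday-photo")` is `False`, while an image whose fingerprint is in the database returns `True`. In the real design, a single match would not trigger review; a threshold of matches was required before any safety voucher could be decrypted.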

However, imagine the negative impact of having your iPhone photos searched by Apple every now and then. Could the government force Apple to search through users’ photo galleries for evidence related to certain cases? Would Apple be in a position to say “No”? There is a lot of depth to this controversial feature, and Apple may have made the right decision by halting its progress in the Apple ecosystem. What do you think?



