Apple has quietly removed all references to CSAM from its Child Safety webpage, suggesting that its controversial plan to detect child sexual abuse material on iPhones and iPads may have been put on hold following heavy criticism of the approach. In August, Apple announced a planned suite of new child safety features, including scanning users’ iCloud Photos libraries for child sexual abuse material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
Following the announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook’s former security chief, politicians, policy groups, university researchers, and even some Apple employees. Most of the criticism was directed at Apple’s planned on-device CSAM detection, which researchers argued relied on dangerous technology that bordered on surveillance and derided as ineffective at identifying images of child sexual abuse.
Apple initially attempted to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and additional documents, and making company executives available for interviews, among other efforts. Despite those efforts, the controversy did not subside. Apple ultimately went ahead with the rollout of the Communication Safety features for Messages, which went live earlier this week with the release of iOS 15.2, but it opted to postpone the CSAM rollout amid a wave of criticism it evidently had not anticipated.
“We have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said, citing feedback from customers, advocacy groups, researchers, and others.
That statement was posted to Apple’s Child Safety page, but it has now been removed, along with all other mentions of CSAM, raising the possibility that Apple has abandoned the plan entirely. Apple has been contacted for comment, and we will update this article if we receive a response.
Though the CSAM detection feature is no longer referenced on Apple’s website, spokesperson Shane Bauer stated that the company’s plans for CSAM detection have not changed since September, implying that CSAM detection is still on the way.