Apple says researchers can vet its child safety features. But it’s suing a startup that does just that

Apple Inc. filed an appeal on Tuesday in a copyright case against security firm Corellium, which helps researchers examine software such as Apple’s anticipated new method for detecting child sexual abuse images.

A federal judge last year dismissed Apple’s copyright claims against Corellium, which makes replicated iPhones that researchers use to analyse how the tightly locked-down devices work.

Corellium’s key customers include security specialists, and the flaws they have discovered have been reported to Apple for cash bounties and used elsewhere, including by the FBI in cracking the phone of a mass shooter in San Bernardino, California.

Apple’s software is difficult to study, and the specialised research phones it provides to pre-selected experts come with a slew of limitations. The company did not respond to a request for comment. The appeal came as a surprise because Apple had just settled other claims with Corellium relating to the Digital Millennium Copyright Act, avoiding a trial.

Experts were also surprised that Apple resurrected a legal battle with a major research tool provider so soon after claiming that researchers would provide a check on its contentious plan to scan customer devices.

“Enough is enough,” said Corellium Chief Executive Amanda Gorton. “Apple can’t pretend to hold itself accountable to the security research community while simultaneously trying to make that research illegal.”

Under Apple’s plan, software will automatically analyse photos destined for upload to iCloud online storage from phones or computers to determine whether they match digital identifiers of known child abuse images. If there are enough matches, Apple staff will review the images to confirm they are illegal, then terminate the account and report the user to law enforcement.
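The matching step described above can be sketched in simplified form. Apple’s actual system uses a proprietary perceptual hash (NeuralHash) and cryptographic private set intersection, none of which is reproduced here; this illustration substitutes a plain SHA-256 digest as the image fingerprint, and the function names and threshold value are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint: a SHA-256 digest of the raw image bytes.
    (Apple's real system uses a perceptual hash, which also matches
    near-duplicates; a cryptographic hash only matches exact bytes.)"""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(images, known_hashes, threshold=3):
    """Return True once the number of images whose fingerprints appear
    in the known-bad set reaches the threshold, mirroring the idea that
    an account is only escalated after enough matches accumulate."""
    matches = sum(1 for img in images if fingerprint(img) in known_hashes)
    return matches >= threshold
```

The threshold is the key privacy knob in this design: a single match reveals nothing to human reviewers, and escalation happens only after repeated hits against the known-image list.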

David Thiel of the Stanford Internet Observatory tweeted,

“‘We’ll prevent abuse of these child safety mechanisms by relying on people bypassing our copy protection mechanisms’ is a pretty internally incoherent argument.”

Digital rights groups have objected to the plan, because Apple has promoted itself as committed to user privacy, whereas other companies only check content after it has been stored online or shared.

One of their primary objections is that governments could force Apple to scan for forbidden political material as well, or to target a single user.

Apple executives defended the initiative, saying researchers could verify the list of prohibited images and study the data sent to the company, keeping it honest about what it was looking for and from whom. One executive said such scrutiny made the approach better for overall privacy than scanning in Apple’s own storage, where the code is kept private.