December 1, 2023

Apple on Thursday unveiled changes to iPhones designed to detect child sexual abuse, a move likely to please parents and the police but one that is already worrying privacy watchdogs.

Later this year, iPhones will begin using complex technology to detect images of child sexual abuse, commonly known as child pornography, that users upload to Apple’s iCloud storage service, the company said. Apple also said parents will soon be able to turn on a feature that can flag when their children send or receive nude photos in a text message.

Apple said it designed the new features to protect user privacy, including by ensuring that Apple never sees or learns of any nude photos exchanged in a child’s text messages. The scanning is done on the child’s device, and the notifications are sent only to the parents’ devices. Apple provided quotes from some cybersecurity experts and child safety groups that praised the company’s approach.

Other cybersecurity experts were still concerned. Matthew D. Green, a cryptography professor at Johns Hopkins University, said Apple’s new features set a dangerous precedent by developing surveillance technologies that law enforcement agencies or governments could exploit.

“They sold privacy to the world and got people to trust their devices,” Mr. Green said. “But now they are basically capitulating to the worst possible demands of any government. I don’t see how they can say no from now on.”

Apple’s moves follow a 2019 New York Times investigation that revealed a global criminal underworld that exploited flawed and inadequate efforts to contain the explosion of images of child sexual abuse. The investigation found that many technology companies were unable to adequately monitor their platforms and that the amount of such content increased dramatically.

While the material predates the internet, technologies such as smartphone cameras and cloud storage have allowed the images to be more widely disseminated. Some images circulate for years, continuing to traumatize and haunt the people depicted.

But the mixed reviews of Apple’s new features show the fine line technology companies must walk between aiding public safety and ensuring customer privacy. Law enforcement officials have complained for years that technologies like smartphone encryption have crippled criminal investigations, while technology executives and cybersecurity experts have argued that such encryption is vital to protecting people’s data and privacy.

In Thursday’s announcement, Apple attempted to thread that needle. It said it had developed a way to root out child predators that would not compromise iPhone security.

To detect child sexual abuse material, or CSAM, uploaded to iCloud, iPhones will use a technology called image hashing, Apple said. The software reduces a photo to a unique set of numbers – a sort of digital fingerprint.
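Apple has not published the details of its hashing algorithm, which it describes as a neural hash. As a rough illustration of what an image fingerprint is, the sketch below computes a much cruder “average hash,” assuming the Pillow imaging library; it stands in for, and should not be mistaken for, Apple’s actual method.

```python
# A minimal sketch of perceptual image hashing, assuming the Pillow
# imaging library. Apple's production system uses a proprietary neural
# hash; this simple "average hash" only illustrates the fingerprint idea.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint.

    Shrinking and grayscaling discard fine detail, so visually similar
    copies of an image (resized, recompressed) tend to produce the
    same bits.
    """
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:                      # one bit per pixel
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits
```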


The iPhone operating system will soon store a database of hashes of known child sexual abuse images, provided by organizations such as the National Center for Missing & Exploited Children, and it will compare those hashes against the hash of every photo in a user’s iCloud to see whether there is a match.

Once an account reaches a certain number of matches, the photos will be shown to an Apple employee to confirm that they are indeed images of child sexual abuse. If so, they will be forwarded to the National Center for Missing & Exploited Children, and the user’s iCloud account will be locked.
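In the design Apple has described, cryptographic techniques (private set intersection and threshold secret sharing) keep individual match results hidden, even from Apple, until an account crosses the threshold. The sketch below strips that machinery away and shows only the counting logic; the threshold value and the names are assumptions for illustration.

```python
# Simplified sketch of the matching-and-threshold step. In Apple's
# described protocol, private set intersection and threshold secret
# sharing keep per-photo match results encrypted until the threshold
# is crossed; the constants and names below are assumptions.
KNOWN_HASHES: set[int] = set()    # hypothetical: fingerprints of known images
REVIEW_THRESHOLD = 30             # hypothetical match count triggering review

def needs_human_review(photo_hashes: list[int]) -> bool:
    """True once enough of an account's photos match the database."""
    matches = sum(1 for h in photo_hashes if h in KNOWN_HASHES)
    return matches >= REVIEW_THRESHOLD
```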

Apple said this approach means that people who do not have child sexual abuse material on their phones will not have their photos seen by Apple or the authorities.

“If you’re storing a collection of CSAM material, yes, this is bad for you,” said Erik Neuenschwander, Apple’s user privacy chief. “But for the rest of you, this is no different.”

Apple’s system does not scan videos uploaded to iCloud, although perpetrators have been using the format for years. In 2019, the number of videos reported to the National Center exceeded the number of photos for the first time. The center often receives multiple reports for the same content.

US law requires technology companies to report cases of child sexual abuse to the authorities. Apple has historically reported fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.

Apple’s other feature, which scans photos in text messages, will be available only to families with shared Apple iCloud accounts. If parents turn it on, their child’s iPhone will analyze every photo received or sent in a text message to determine whether it includes nudity. Nude photos sent to a child will be blurred, and the child will have to choose whether to view them. If children under 13 want to view or send a nude photo, their parents will be notified.
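As a rough sketch of that decision flow, assuming a hypothetical on-device classifier and illustrative names (nothing here is Apple’s actual API):

```python
# Rough sketch of the communication-safety flow described above. The
# classifier stub, age cutoff handling, and return values are
# illustrative assumptions, not Apple's on-device API.
from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int

def looks_like_nudity(photo: bytes) -> bool:
    """Stand-in for the on-device ML classifier (assumption)."""
    return False  # a real classifier would analyze the image content

def handle_received_photo(photo: bytes, child: ChildAccount,
                          child_taps_view: bool) -> dict:
    """Return the UI actions implied by the feature description."""
    if not looks_like_nudity(photo):
        return {"show": "normal"}
    actions = {"show": "blurred"}          # nude photos arrive blurred
    if child_taps_view and child.age < 13:
        actions["notify_parents"] = True   # parents alerted for under-13s
    return actions
```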

Mr. Green said he feared such a system could be abused because it showed law enforcement agencies and governments that Apple now has a way to flag certain content on a phone while maintaining encryption. Apple had previously argued to the authorities that encryption prevents it from retrieving certain data.

“What if other governments ask Apple to use this for other purposes?” asked Mr. Green. “What will Apple say?”

Mr. Neuenschwander dismissed those concerns, saying that safeguards are in place to prevent abuse of the system and that Apple would reject any such demands from a government.

“We’ll let them know that we didn’t build what they’re thinking of,” he said.

The Times reported earlier this year that, under pressure from the Chinese government, Apple had compromised the private data of its Chinese users and proactively censored apps in the country.

Hany Farid, a computer science professor at the University of California, Berkeley, who helped develop early image-hashing technology, said that any conceivable risks in Apple’s approach were worth the safety of children.

“If reasonable safeguards are in place, I think the benefits will outweigh the drawbacks,” he said.

Michael H. Keller and Gabriel J.X. Dance contributed reporting.