Apple says feature to find child images doesn’t create a back door

An illuminated Apple logo at a store in Munich, Germany. Apple is pushing back on criticisms that a new system for detecting child pornography on users’ phones constitutes an invasion of privacy.
(Matthias Schrader / Associated Press)
Apple Inc. responded to concerns about its upcoming child safety features, saying it doesn’t believe its tool for locating child pornographic images on a user’s device creates a back door that reduces privacy.

The Cupertino, Calif.-based technology giant made the comments in a briefing Friday, a day after revealing new features for iCloud, Messages and Siri to combat the spread of sexually explicit images of children. The company reiterated that it doesn’t scan a device owner’s entire photo library to look for abusive images but instead uses cryptography to compare images with a known database provided by the National Center for Missing & Exploited Children.

Some privacy advocates and security researchers were concerned after Apple’s announcement, fearing the company would scan a user’s complete photo collection. Instead, the company uses an on-device algorithm to detect known sexually explicit images. Apple said it would manually review flagged photos from a user’s device only if the algorithm found a certain number of matches, and added that it can adjust that threshold over time.
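The threshold-based matching Apple describes can be illustrated with a toy sketch. Note the assumptions: Apple’s actual system uses its NeuralHash algorithm and cryptographic techniques it has not fully disclosed; the `average_hash` function, the `REVIEW_THRESHOLD` value, and all names below are hypothetical stand-ins meant only to show the shape of hash matching plus a review threshold.

```python
# Illustrative only: a trivial perceptual hash and a review threshold.
# Apple's real system (NeuralHash, private database matching) works
# differently in its details; the threshold value here is invented.

REVIEW_THRESHOLD = 3  # hypothetical; the real value is not public


def average_hash(pixels):
    """Toy hash: one bit per pixel, set if the pixel exceeds the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)


def matches_database(photo_hash, known_hashes, max_distance=0):
    """Match if the Hamming distance to any known hash is within tolerance."""
    return any(bin(photo_hash ^ h).count("1") <= max_distance
               for h in known_hashes)


def flag_for_review(photo_hashes, known_hashes, threshold=REVIEW_THRESHOLD):
    """Human review is triggered only once the match count crosses the threshold."""
    hits = sum(1 for h in photo_hashes if matches_database(h, known_hashes))
    return hits >= threshold
```

The key property the article describes survives even in this sketch: no single match triggers review, and the device never reports which photos failed to match.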

Apple said it isn’t breaking end-to-end encryption with a new feature in the Messages app that analyzes photos sent to or from a child’s iPhone for explicit material, nor will the company gain access to user messages. Asked during the briefing whether the new tools mean the company will add end-to-end encryption to iCloud storage backups, Apple said it wouldn’t comment on future plans. End-to-end encryption, the most stringent form of privacy, lets only the sender and receiver see a message sent between them.

On Thursday, the Electronic Frontier Foundation said Apple is opening a back door to its highly touted privacy features for users with the new tools. “It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” the EFF said in a post on its website. “As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

Apple said the system had been in development for years and wasn’t built for governments to monitor citizens. The system is available only in the U.S., Apple said, and works only if a user has iCloud Photos enabled.

Dan Boneh, a cryptography researcher tapped by Apple to support the project, defended the new tools.

“This issue affects many cloud providers,” he said. “Some cloud providers address this problem by scanning photos uploaded to the cloud. Apple chose to invest in a more complex system that provides the same functionality, but does so without having its servers look at every photo.”
