By Ryan Lovelace - The Washington Times - Monday, August 9, 2021

Apple is planning to scan its users’ messages and photos to protect children from predators, but privacy advocates say the new system will open the door to other abuses.

Starting with software updates later this year, Apple said it will use “on-device machine learning” to warn about sensitive content in messages and to examine photos stored in iCloud, the cloud storage service used across iPhones, iPads and Mac computers. “Machine learning” is largely synonymous with artificial intelligence and generally refers to computers learning to perform tasks from data rather than from explicit, step-by-step programming.

Among the new tools is a feature in Apple’s Messages app that will warn children and their parents if the child is receiving or sending sexually explicit photos. The new function will blur any such photos it detects and, if the child chooses to view a flagged photo anyway, alert the child’s parents.
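
As described, both the detection and the blur happen on the device, and the parental alert is tied to the child’s choice to tap through. A minimal sketch of that flow in Swift, with hypothetical function names standing in for Apple’s unpublished APIs:

```swift
import Foundation

// Hypothetical stand-ins for the on-device classifier and the UI/notification
// hooks; Apple has not published these APIs.
func looksSexuallyExplicit(_ image: Data) -> Bool { false }  // on-device ML classifier
func display(_ image: Data) { print("show photo") }
func displayBlurred(_ image: Data, onTapThrough: () -> Void) { print("show blurred photo") }
func notifyParent() { print("alert parent") }

// The flow the article describes: flagged photos on a child's account are
// blurred, and a parent is alerted only if the child taps through anyway.
func handleIncomingPhoto(_ image: Data, isChildAccount: Bool) {
    guard isChildAccount, looksSexuallyExplicit(image) else {
        return display(image)               // everything else displays normally
    }
    displayBlurred(image) {                 // tap-through is the child's choice
        notifyParent()
        display(image)
    }
}
```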

Apple’s new software will also scan for child sexual abuse material (CSAM), matching images against a database of known material maintained by the National Center for Missing and Exploited Children (NCMEC).

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” said Apple in a statement on its website announcing the plan. “This matching process is powered by a cryptographic technology called ‘private set intersection,’ which determines if there is a match without revealing the result.”
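
Apple’s full protocol is cryptographically involved, but the on-device step amounts to hashing each photo and testing membership in a list of known hashes. A heavily simplified sketch, using an ordinary SHA-256 digest and a plain set lookup in place of Apple’s perceptual “NeuralHash” and its private set intersection protocol (which hides non-matches from both sides):

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for the NCMEC-derived hash database; the one entry
// below is the SHA-256 of empty data, so the demo has a guaranteed match.
let knownHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Hash the image bytes and test membership. Apple's real system uses a
// perceptual hash that is robust to resizing and recompression, and a
// private set intersection protocol so the device never learns the
// database contents; a cryptographic hash is used here only for shape.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

print(matchesKnownDatabase(Data()))  // true: empty data hashes to the sample entry
```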

The Apple device then creates a cryptographic “safety voucher” that encodes the match result, along with additional encrypted information about the image, and uploads it to iCloud alongside the image.
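
A rough sketch of the voucher’s data shape follows, with invented field names; Apple’s published design uses threshold cryptography, under which individual vouchers stay unreadable until an account crosses a match threshold, which the single symmetric key below does not capture:

```swift
import Foundation
import CryptoKit

// Rough sketch of a "safety voucher": the match result and encrypted data
// about the image travel to the server alongside the upload. Field names
// are invented for illustration.
struct SafetyVoucher {
    let imageID: String   // which iCloud Photos upload this voucher describes
    let payload: Data     // encrypted match info, opaque to the server
}

func makeVoucher(imageID: String, matchInfo: Data, key: SymmetricKey) throws -> SafetyVoucher {
    // A single AES-GCM key stands in for Apple's threshold scheme, under
    // which no voucher can be decrypted until enough matches accumulate.
    let sealed = try AES.GCM.seal(matchInfo, using: key)
    return SafetyVoucher(imageID: imageID, payload: sealed.combined!)
}
```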

In a frequently asked questions document published by Apple on Sunday, the company said it would not scan all of the photos on every user’s iPhone.

“By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images and only the images that match to known CSAM,” read Apple’s document. “The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.”
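
In code terms, the FAQ describes a gate at the top of the upload path rather than a scan of the photo library itself. A sketch with hypothetical names, reusing `matchesKnownDatabase` from the earlier example:

```swift
import Foundation

// The check lives inside the iCloud Photos upload path, so turning iCloud
// Photos off means nothing is uploaded and nothing is scanned. Names here
// are hypothetical.
func prepareUpload(_ image: Data, iCloudPhotosEnabled: Bool) -> Data? {
    guard iCloudPhotosEnabled else { return nil }  // no upload, no scan
    if matchesKnownDatabase(image) {
        // attach a safety voucher to this upload (see the sketch above)
    }
    return image
}
```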

The moves by Apple won praise from those concerned about how predators use modern technology to target children. Sen. Richard Blumenthal, Connecticut Democrat, tweeted that Apple’s move was a “welcome, innovative & bold step.”

“This shows that we can both protect children and our fundamental privacy rights,” Mr. Blumenthal tweeted.

But privacy advocates such as the Electronic Frontier Foundation disagree. EFF’s India McKinney and Erica Portnoy wrote that while the computer giant’s aim appears well-intentioned, the plan breaks the company’s promises to users about privacy and encryption.

“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan not just children’s but anyone’s accounts,” wrote Ms. McKinney and Ms. Portnoy. “That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”

EFF noted that authoritarian governments are likely candidates to abuse the feature, pointing to Indian government rules requiring the prescreening of content and to new laws in Ethiopia governing the removal of online misinformation.

Apple is not alone in developing digital tools to scan for child sexual abuse material. In February, Google said its YouTube engineers had created software that identifies re-uploads of child sexual abuse material, along with new “machine learning classifiers” to identify never-before-seen imagery.
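
The two techniques in that description are distinct: hash matching catches re-uploads of known material, while a trained classifier scores images no one has seen before. A sketch of how they might compose, with `classifierScore` a hypothetical stand-in for Google’s unpublished models:

```swift
import Foundation

// Placeholder for an ML model that returns a probability-like score.
func classifierScore(_ image: Data) -> Double { 0.0 }

// Known re-uploads are caught by hash matching (the earlier sketch);
// novel imagery is flagged only on a high-confidence classifier score.
func isLikelyAbuseMaterial(_ image: Data) -> Bool {
    if matchesKnownDatabase(image) { return true }
    return classifierScore(image) > 0.98
}
```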

According to the MIT Technology Review, the tools are likely the first of several child-protection changes Apple has in store for the coming months.

• Ryan Lovelace can be reached at rlovelace@washingtontimes.com.
