Apple plans to scan US iPhones for child sexual abuse images

Security researchers fear neuralMatch system could be misused to spy on citizens

Associated Press

Last modified on Fri 6 Aug 2021 03.31 EDT

Apple has unveiled plans to scan its iPhones in the US for images of child sexual abuse, drawing praise from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool, called neuralMatch, is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child abuse is confirmed, the user’s account will be disabled and the US National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates. The photo-matching system will flag only images that are already in the centre’s database of known child abuse images. Parents taking photos of a child in the bath presumably need not worry. But researchers say the matching tool – which doesn’t “see” such images, just mathematical “fingerprints” that represent them – could be put to different purposes.
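
To make the “fingerprints” idea concrete, here is a minimal Python sketch of fingerprint matching built on a toy “average hash”. This is an assumption-laden illustration, not Apple’s method: neuralMatch’s actual hash function, thresholds and matching protocol are not public, and the names average_hash and flag_if_known are invented for this example.

```python
# A toy illustration of fingerprint matching, NOT Apple's algorithm.
# neuralMatch's real hash and matching protocol are proprietary; this
# "average hash" just shows how a system can compare compact mathematical
# fingerprints without ever "seeing" the images themselves.
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint: grayscale it, shrink it
    to 8x8, then set one bit per pixel brighter than the mean."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)  # first pixel becomes the top bit
    return bits

def flag_if_known(img: Image.Image, known_hashes: set[int],
                  max_distance: int = 5) -> bool:
    """Flag the image when its fingerprint is within a few bits (Hamming
    distance) of any fingerprint in the database of known images."""
    h = average_hash(img)
    return any(bin(h ^ known).count("1") <= max_distance
               for known in known_hashes)
```

In this sketch, a service would call flag_if_known on each image before upload and route any hit to human review, mirroring the workflow described above.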

Matthew Green, a cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child abuse images. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.
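
As a toy demonstration of the kind of trick Green describes – again using the simplistic average hash sketched above, not Apple’s actual algorithm – an attacker who knows a target fingerprint can construct an unrelated image that matches it exactly:

```python
# Toy second-preimage attack on the average hash above (illustrative only;
# real perceptual hashes are harder targets, though researchers have
# demonstrated collisions against such systems too).
from PIL import Image

def forge_matching_image(target_hash: int, size: int = 8) -> Image.Image:
    """Build an image whose average hash equals target_hash: a bright
    pixel wherever the target bit is 1, a dark pixel wherever it is 0
    (a real attack would hide such a pattern in a normal-looking photo)."""
    n = size * size
    pixels = [255 if (target_hash >> (n - 1 - i)) & 1 else 0
              for i in range(n)]
    img = Image.new("L", (size, size))
    img.putdata(pixels)
    return img  # average_hash(img) == target_hash, so flag_if_known fires
```

The underlying point is the one Green makes: matching on fingerprints rather than on content leaves room for engineered collisions.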

Other abuses could include government surveillance of dissidents or protesters. “What happens when the Chinese government says: ‘Here is a list of files that we want you to scan for’?” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”

Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those to scan user files stored in its iCloud service, which is not as securely encrypted as its on-device data, for child abuse images.

Apple has been under government pressure for years to allow for increased surveillance of encrypted data. Coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.

But the Electronic Frontier Foundation, an online civil liberties pioneer, called Apple’s compromise on privacy protections “a shocking about-face for users who have relied on the company’s leadership in privacy and security”.

The computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child abuse images online, acknowledged the potential for abuse of Apple’s system but said it was far outweighed by the imperative of battling child sexual abuse.

“Is it possible? Of course. But is it something that I’m concerned about? No,” said Hany Farid, a researcher at the University of California, Berkeley, who argued that plenty of other programs designed to secure devices from various threats hadn’t suffered from “this type of mission creep”. For example, WhatsApp provides users with end-to-end encryption to protect their privacy, but also employs a system for detecting malware and warning users not to click on harmful links.

Apple was one of the first major companies to embrace end-to-end encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressed the company for access to that information. Apple said the latest changes would roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.

“Apple’s expanded protection for children is a gamechanger,” said John Clark, the president and chief executive of the National Center for Missing and Exploited Children. “With so many people using Apple products, these new safety measures have lifesaving potential for children.”

But the Washington-based Center for Democracy and Technology called on Apple to abandon the changes, which it said in effect destroyed the company’s guarantee of end-to-end encryption. Scanning messages for sexually explicit content on phones or computers, it said, effectively breaks that security.

The group also questioned Apple’s technology for differentiating between dangerous content and something as innocuous as art or a meme. Such technologies were notoriously error-prone, it said.

Apple denies that the changes amount to a backdoor that degrades its encryption. It says they are carefully considered innovations that do not disturb user privacy but rather strongly protect it.
