US National Security Agency whistleblower Edward Snowden has joined the chorus of privacy advocates calling for Apple to abandon its plan to scan practically every iPhone for child sexual abuse material (CSAM) over concerns the system will expand to other content in the future.
“I can’t think of any other company that has so proudly, and so publicly, distributed spyware to its own devices—and I can’t think of a threat more dangerous to a product’s security than the mischief of its own maker,” Snowden writes in his newsletter, known as Continuing Ed. “There is no fundamental technological limit to how far the precedent Apple is establishing can be pushed, meaning the only restraint is Apple’s all-too-flexible company policy, something governments understand all too well.”
Snowden is best known for the 2013 revelation of a number of mass surveillance programs conducted by the NSA and its counterparts in the Five Eyes intelligence alliance. After making those disclosures he fled to Hong Kong and then to Russia, where he was granted asylum and became a permanent resident in 2020.
He’s referring here to Apple’s plan to compare images stored on an iPhone against known CSAM whenever the device is connected to its iCloud Photos service. Rather than scanning each photo after it reaches the cloud, Apple uses on-device processing to compare a hash of the image to an index of hashes of known CSAM drawn from a national database, then generates a report for its employees when it finds a potential match.
Once a certain number of reports accumulates, the company will manually compare the flagged images to the CSAM database. If there’s a match, a report will be sent to the National Center for Missing and Exploited Children and the user’s iCloud account will be deactivated. Scanning the photos on-device is supposed to offer a compromise between privacy and the safety of children.
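To make the threshold mechanism concrete, here is a deliberately simplified sketch in Swift. It only illustrates the idea of counting matches against a reference set of known hashes before escalating to human review; Apple’s actual design used a perceptual hash (NeuralHash) and cryptographic techniques so that matches below the threshold aren’t revealed, and the type, property, and hash values below are hypothetical.

```swift
// Simplified illustration of threshold-based hash matching.
// Not Apple's implementation: names, hashes, and logic are placeholders.
struct CSAMMatcher {
    let knownHashes: Set<String>   // hashes of known CSAM from the reference database
    let reportThreshold: Int       // number of matches before human review is triggered

    // Returns true if the number of matching image hashes meets the review threshold.
    func shouldEscalate(imageHashes: [String]) -> Bool {
        let matchCount = imageHashes.filter { knownHashes.contains($0) }.count
        return matchCount >= reportThreshold
    }
}

// Hypothetical usage with placeholder hash strings.
let matcher = CSAMMatcher(knownHashes: ["a1b2", "c3d4"], reportThreshold: 2)
let uploads = ["a1b2", "ffff", "c3d4"]
print(matcher.shouldEscalate(imageHashes: uploads))  // true: two matches meet the threshold
```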
Apple’s plan has been roundly criticized for its technical implementation, however, because the company’s hashing process can readily produce both false positives and false negatives. Snowden notes that people hoarding or distributing CSAM also have an easy way to evade detection: disabling iCloud Photos on the devices they use to engage with that material.
But the broader concern is that Apple won’t be able to limit this on-device processing to CSAM. Jonathan Mayer and Anunay Kulshrestha, two researchers who published what they described as “the only peer-reviewed publication on how to build a system like Apple’s,” said they believed such a tool would be too dangerous to deploy even with the best of intentions.
“Our system could be easily repurposed for surveillance and censorship,” Mayer and Kulshrestha said in an op-ed for The Washington Post. “The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.”
Snowden echoes those concerns. “Apple’s proposal to make their phones inform on and betray their owners marks the dawn of a dark future, one to be written in the blood of the political opposition of a hundred countries that will exploit this system to the hilt,” he says. “See, the day after this system goes live, it will no longer matter whether or not Apple ever enables end-to-end encryption, because our iPhones will be reporting their contents before our keys are even used.
“I would say there should be a law, but I fear it would only make things worse,” he adds. “To put it bluntly, this is not an innovation but a tragedy, a disaster-in-the-making.”