Is it iPhone scanning? Or is Apple trying to avoid rifling through your personal data?
Last week, the company introduced an upcoming iOS 15 feature intended to stop the spread of child sexual abuse imagery online. It immediately sparked controversy because the system will tap into the iPhone’s “on-device” processing to detect illegal content stored in an iCloud Photos account.
The words “iPhone scanning” made headlines, but according to Apple, the phrasing overshadows the real privacy benefits of on-device processing.
On Monday, the company’s director of user privacy, Erik Neuenschwander, held a press briefing to clear the air. According to him, Apple’s approach doesn’t amount to traditional scanning, in which the company’s servers would view every file on the iPhone and learn its contents.
Instead, the company is using on-device processing to create a record. The iPhone essentially compares an uploaded photo against a national database of indexed child sexual abuse material (CSAM). The device then generates a cryptographic “safety voucher” that encodes whether the image matches a known CSAM file already circulating on the internet.
The voucher is then stored on Apple’s servers. However, the voucher itself won’t reveal the contents of the image file. Apple can decrypt the vouchers only if the iCloud account crosses a threshold of suspected CSAM matches.
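To make the flow more concrete, here is a rough sketch, written in Swift, of the process Apple describes. None of it is Apple’s actual code: the hash function, database, voucher format, and threshold value are all placeholders, and the real system reportedly uses a perceptual image hash and cryptographic protocols that keep individual match results hidden until the threshold is crossed.

```swift
import Foundation
import CryptoKit

// A toy sketch of the matching flow described above, with placeholder names.
// It only illustrates the overall shape: hash the photo, compare it against a
// database of known-CSAM hashes, attach an encrypted "safety voucher" to the
// upload, and let the server act only past a threshold of matches.

struct SafetyVoucher {
    let photoID: UUID
    let encryptedMatchResult: Data   // opaque below the account threshold
}

// Placeholder database of hashes of known, already-indexed CSAM.
// On a real device this would ship as an encrypted, unreadable table.
let knownImageHashes: Set<String> = []

// Placeholder hash: a plain SHA-256 of the image bytes. The real system uses
// a perceptual hash so visually similar images still match.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Step 1 (on device): when a photo is uploaded to iCloud Photos, produce a
// voucher that encodes whether its hash matches the database.
func makeSafetyVoucher(for imageData: Data, key: SymmetricKey) throws -> SafetyVoucher {
    let matched = knownImageHashes.contains(imageHash(imageData))
    let payload = Data([matched ? 1 : 0])
    let sealed = try AES.GCM.seal(payload, using: key).combined!
    return SafetyVoucher(photoID: UUID(), encryptedMatchResult: sealed)
}

// Step 2 (on server): vouchers are stored alongside the photos. Only when the
// number of matches for an account crosses a threshold does the server act.
// (In Apple's design the server cannot decrypt anything below the threshold;
// this sketch just models the rule.)
let reviewThreshold = 30   // placeholder value, not Apple's actual number

func accountNeedsReview(matchCount: Int) -> Bool {
    matchCount >= reviewThreshold
}
```

In this sketch, the upload path would call makeSafetyVoucher for each photo sent to iCloud Photos, and the servers would count matching vouchers per account, taking action only once accountNeedsReview returns true.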
Confusing, But Less Invasive?
The approach isn’t exactly easy to understand. But according to Neuenschwander, it’s far less invasive than the child sexual imagery detection systems that email and cloud storage platforms already run, which can involve indiscriminately scanning every image in a user’s account. Apple, by contrast, is trying to add a layer of privacy that prevents the company from learning about your image files, and doing that requires on-device processing.
That on-device processing has prompted security researchers and the head of WhatsApp to ring alarm bells about the so-called scanning, warning that the same system could be abused for wide-scale surveillance of people’s personal hardware. Neuenschwander, however, says the term “on-device scanning” mischaracterizes the technology at play.
Ivan Krstić, head of security engineering and architecture at Apple, added that other approaches to scanning for CSAM on cloud platforms are usually opaque and require viewing every image. Apple, on the other hand, plans to make its own CSAM detection algorithms open to security researchers, according to Krstić. During the press briefing, Apple also emphasized that it expects the upcoming system to make a difference in helping law enforcement stop child sex predators.
The CSAM system will first be deployed in the US after iOS 15 rolls out in the fall. When, or whether, Cupertino will expand it to other countries was left unsaid. Once the system arrives, it’ll also create safety vouchers for photos already stored in iCloud Photos.
If you’re not a fan, Apple says you can simply disable iCloud Photos on your iPhone. The on-device processing will not cover the Photos app or any other image files stored only on the device. The company has also vowed never to give governments access to the system.