Apple has tried to deflect criticism of its controversial CSAM protection system, but in doing so has illustrated just what’s at stake.
The big conversation
Apple last week announced it would introduce a collection of child protection measures in iOS 15, iPadOS 15 and macOS Monterey when the operating systems ship this fall.
Among other protections, the on-device system scans your iCloud Photos library for evidence of illegal collections of Child Sexual Abuse Material (CSAM). It is, of course, completely appropriate to protect children, but privacy advocates remain concerned about the potential for Apple’s system to become a full-fledged surveillance tool.
In an attempt to mitigate criticism, Apple has published fresh information that explains a little more about how the tech works. As described in an Apple white paper, the system turns each image on your device into a numeric hash that can be compared against a database of hashes of known CSAM images as you upload photos to iCloud Photos.
Making a hash of it
While the image analysis takes place on the device using Apple’s hashing technology, not every image is flagged, only those whose hashes match known CSAM images. Apple argues this is actually an improvement, as the company at no point scans the entire library.
“Existing techniques as implemented by other companies scan all user photos stored in the cloud. This creates privacy risk for all users,” the company’s new FAQ says. “CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos unless they both match to known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM.”
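To make the mechanics concrete, here is a minimal sketch of hash-based matching in Swift. It is illustrative only: Apple’s actual system uses a perceptual “NeuralHash” (which tolerates resizing and recompression) combined with private set intersection and a match threshold, none of which is modeled here. The SHA-256 lookup, the placeholder hash set, and the function names below are all simplifying assumptions.

```swift
import Foundation
import CryptoKit

// Hypothetical set of hex-encoded hashes of known images. In Apple's real
// design this would be a blinded NeuralHash database, not plain strings.
let knownHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Hash a photo's raw bytes and check the result against the known set.
/// Only a match would ever be flagged; non-matching photos reveal nothing.
func matchesKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: hash a photo at upload time.
let photo = Data("example photo bytes".utf8)
print(matchesKnownImage(photo)) // false unless its hash is in the set
```

In Apple’s published design, even a match is not immediately visible to the company: encrypted “safety vouchers” accompany uploads and only become readable once an account crosses a threshold number of matches.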
Despite these reassurances, huge concerns still exist over the extent to which the system can be extended to monitor other forms of content. After all, if you can turn a collection of CSAM images into data that can be identified, you can turn anything into data against which personal information can be scanned. Privacy advocate Edward Snowden warns, “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”
Take it on trust?
Apple says it has no intention of pushing its system into other domains. In its FAQ, it writes:
“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”
On the surface, that seems reassuring. But it stands to reason that, now this technology exists, nations that want Apple to extend on-device surveillance to matters beyond CSAM will use every weapon they have to press the issue.
“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content,” warned the Electronic Frontier Foundation.
Preventing this will be a struggle, which means Apple has a fight ahead.
The privacy war has begun
It may be that this is a fight Apple wants to have. After all, we know it has taken many significant steps to protect user privacy across its ecosystems and we also know it supports changes in law to protect privacy online.
“It is certainly time, not only for a comprehensive privacy law here in the U.S., but also for worldwide laws and new international agreements that enshrine the principles of data minimization, user knowledge, user access and data security across the globe,” CEO Tim Cook said this year.
You might argue that the high-profile introduction of Apple’s child protection measures has provoked a wider conversation about rights and privacy in an online and connected world. The only way to prevent the system from being extended beyond CSAM is to support Apple in resisting pressure to do so.
Without such support, Apple is unlikely to prevail against every government alone; the question becomes when, not if, it will be forced to concede. And yet, governments can still reach an accord around privacy online.
The stakes are high. The risk is that the bricks Cook has long sought to lay along a sunlit path to justice may instead become bricks in a wall that blocks any such journey.
The benefit is that a determined effort may yet create the frameworks needed to reach that path’s end.
The controversy reflects how rocky that road seems to have become.