Apple has made a lot of pro-privacy people mad this week with the announcement of plans to start scanning everyone’s iPhone photos for child sexual abuse material (CSAM).
To do that, it’ll compare every photo on an iCloud-enabled device (including iPads and Macs) against databases of known CSAM images by checking “hashes.” Think of a hash as a number that represents an image. That number, created when the photo’s data is run through a one-way cryptographic algorithm, is supposed to be unique to the image, so a match should be quick and easy to find. Apple employees will then review the image and share any match with the National Center for Missing and Exploited Children (NCMEC).
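The hash-matching idea works roughly like this. The sketch below is purely illustrative, using a standard cryptographic hash (SHA-256) rather than Apple’s actual system, and the hash database is made up for the example:

```python
import hashlib

def image_hash(data: bytes) -> str:
    # One-way hash: easy to compute from the image data,
    # infeasible to reverse back into the image.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of known images (illustrative only;
# real databases hold hashes supplied by child-safety organizations).
known_hashes = {image_hash(b"known-image-bytes")}

def check_photo(photo_data: bytes) -> bool:
    # Matching is just a fast set lookup on the photo's hash --
    # the database never needs to contain the images themselves.
    return image_hash(photo_data) in known_hashes

print(check_photo(b"known-image-bytes"))   # exact byte-for-byte copy matches
print(check_photo(b"slightly-different"))  # any change produces a different hash
```

One caveat the example makes visible: a cryptographic hash like this only matches identical files, which is why systems built for this purpose use “perceptual” hashes designed to survive resizing and re-encoding.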
By law, American companies have to report child abuse and exploitation imagery on their servers to NCMEC, which then works with law enforcement on an investigation. Other tech giants do the same when emails or messages are sent over their platforms. That includes Google, Microsoft and Facebook. So why are so many privacy advocates up in arms about Apple’s announcement?
It’s because Apple is checking photos on your iPhone, not just on its own servers in iCloud. It’s going one step beyond its rivals, checking every photo on a device rather than just on a company server. (It’s also scanning images to check whether they’re of nude children, using a different technology, but that’s all done on the device and nothing is sent to Apple. A simple warning appears, suggesting iPhone users may not want to send or view nude images.)
Alec Muffett, a noted encryption expert and former Facebook security staffer, explained on Twitter that when someone buys a phone, they expect to have control over what’s happening on their property. But Apple is denying that right and “although it ostensibly exists to prevent upload of CSAM to their iCloud platform, they are using the user’s device to do it and making the tectonic-shift statement that ‘it’s ok by us to do this sort of thing to user devices.’”
Muffett and other encryption experts like Johns Hopkins professor Matt Green and NSA leaker Edward Snowden have also raised the alarm that Apple could now be pressured into looking for other material on people’s devices, if a government demands it.
“How such a feature might be repurposed in an illiberal state is fairly easy to visualize. Apple are performing proactive surveillance on client-purchased devices in order to defend their own interests, but in the name of child protection,” Muffett added. “What will China want them to block?
“It is already a moral earthquake.”
The Electronic Frontier Foundation (EFF) said that the changes effectively meant Apple was introducing a “backdoor” onto user devices. “Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” the EFF wrote.
“Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”
Some people like it
Not that everyone is upset by the move. Nicholas Weaver, a computer security expert and lecturer at the University of California, Berkeley, said on Twitter that he didn’t blame Apple for choosing to risk fighting with oppressive regimes and take a tougher stance on child sexual abuse.
And David Thiel from the Stanford Internet Observatory noted that on most internet-connected services, people’s images are already scanned for CSAM.
“This unyielding hostility to reasonable and limited child safety measures drives me up the wall. Even if this compromised privacy—which, as documented, it does not—there are other harms in the world to be balanced with,” he added.