Apple’s privacy chief Erik Neuenschwander has detailed some of the protections built into the company’s CSAM scanning system that prevent it from being used for other purposes – including clarifying that the system performs no hashing if iCloud Photos is off.
The company’s CSAM detection system, which was announced with other new child safety tools, has caused controversy. In response, Apple has offered numerous details about how it can scan for CSAM without endangering user privacy.
In an interview with TechCrunch, Apple privacy head Erik Neuenschwander said the system was designed from the start to prevent government overreach and abuse.
For one, the system only applies in the U.S., where Fourth Amendment protections already guard against illegal search and seizure.
“Well first, that is launching only for US, iCloud accounts, and so the hypotheticals seem to bring up generic countries or other countries that aren’t the US when they speak in that way,” Neuenschwander said. “And therefore it seems to be the case that people agree US law doesn’t offer these kinds of capabilities to our government.”
But even beyond that, the system has baked-in guardrails. For example, the hash list that the system uses to tag CSAM is built into the operating system. It can’t be updated from Apple’s side without an iOS update. Apple also must release any updates to the database on a global scale — it can’t target individual users with specific updates.
The system also only tags collections of known CSAM. A single image isn’t going to trigger anything. More than that, images that aren’t in the database provided by the National Center for Missing and Exploited Children won’t get tagged either.
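The collection-only behavior can be illustrated with a short sketch. This is not Apple’s actual implementation; the function names, the in-memory database, and the threshold value are all illustrative stand-ins for the idea that a single match never flags an account — only a count of known-database matches above a threshold does.

```python
# Illustrative sketch of threshold-based flagging. The names and the
# threshold value are made up; Apple's real system uses on-device
# cryptographic matching against a database shipped in the OS image.

KNOWN_HASH_DATABASE = {"hash_a", "hash_b", "hash_c"}  # hypothetical known-CSAM hashes
MATCH_THRESHOLD = 30  # hypothetical value; only a collection of matches flags an account

def should_flag_account(image_hashes):
    """Flag only when the count of database matches reaches the threshold.

    Images not in the database contribute nothing, and a single match
    is never enough on its own.
    """
    matches = sum(1 for h in image_hashes if h in KNOWN_HASH_DATABASE)
    return matches >= MATCH_THRESHOLD
```

In this sketch, `should_flag_account(["hash_a"])` returns `False` — one match, even against the database, does nothing — while a large collection of matching hashes crosses the threshold.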
Apple also has a manual review process. If an iCloud account gets flagged for a collection of illegal CSAM material, an Apple team will review the flag to ensure that it’s actually a correct match before any external entity is alerted.
“And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM and that we don’t believe that there’s a basis on which people will be able to make that request in the US,” Neuenschwander said.
Additionally, Neuenschwander added, there is still some user choice here. The system only works if a user has iCloud Photos enabled. The Apple privacy chief said that, if a user doesn’t like the system, “they can choose not to use iCloud Photos.” If iCloud Photos is not enabled, “no part of the system is functional.”
“If users are not using iCloud Photos, NeuralHash will not run and will not generate any vouchers. CSAM detection is a neural hash being compared against a database of the known CSAM hashes that are part of the operating system image,” the Apple executive said. “None of that piece, nor any of the additional parts including the creation of the safety vouchers or the uploading of vouchers to iCloud Photos is functioning if you’re not using iCloud Photos.”
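The gating Neuenschwander describes can be sketched in a few lines. Everything here is hypothetical — the function names are not Apple APIs, and the hash and voucher helpers are stand-ins — but it captures the claim: when iCloud Photos is off, no hashing runs and no safety voucher is created or uploaded.

```python
# Hypothetical sketch of the iCloud Photos gate. None of these names are
# real Apple APIs; the helpers are stand-ins for proprietary components.

def compute_neural_hash(photo):
    # Stand-in for the on-device perceptual hash (real NeuralHash is proprietary).
    return hash(photo)

def create_safety_voucher(photo, neural_hash):
    # Stand-in for the encrypted safety voucher attached to an upload.
    return {"photo": photo, "neural_hash": neural_hash}

def process_photo_for_upload(photo, icloud_photos_enabled):
    """Run the detection pipeline only when iCloud Photos is enabled."""
    if not icloud_photos_enabled:
        return None  # no part of the system runs: no hashing, no voucher
    neural_hash = compute_neural_hash(photo)
    return create_safety_voucher(photo, neural_hash)
```

With iCloud Photos disabled, the function returns before any hashing happens — the "no part of the system is functional" behavior the executive describes.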
Although Apple’s CSAM feature has caused a stir online, the company denies that the system can be used for any purpose other than detecting CSAM. Apple clearly states that it will refuse any government attempt to modify or use the system for something other than CSAM.