By John Gruber
Terrific interview — great questions, asked in the right order:
Panzarino: One of the bigger queries about this system is that Apple has said that it will just refuse action if it is asked by a government or other agency to compromise by adding things that are not CSAM to the database to check for them on-device. There are some examples where Apple has had to comply with local law at the highest levels if it wants to operate there, China being an example. So how do we trust that Apple is going to hew to this rejection of interference if pressured or asked by a government to compromise the system?
Neuenschwander: Well first, that is launching only for U.S. iCloud accounts, and so the hypotheticals seem to bring up generic countries or other countries that aren't the U.S. when they speak in that way, and therefore it seems to be the case that people agree U.S. law doesn't offer these kinds of capabilities to our government.
But even in the case where we're talking about some attempt to change the system, it has a number of protections built in that make it not very useful for trying to identify individuals holding specifically objectionable images. The hash list is built into the operating system; we have one global operating system and don't have the ability to target updates to individual users, and so hash lists will be shared by all users when the system is enabled.

And secondly, the system requires the threshold of images to be exceeded, so trying to seek out even a single image from a person's device or set of people's devices won't work, because the system simply does not provide any knowledge to Apple for single photos stored in our service.

And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity. And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, unlike known CSAM, and we don't believe that there's a basis on which people will be able to make that request in the U.S.

And the last point that I would just add is that it does still preserve user choice: if a user does not like this kind of functionality, they can choose not to use iCloud Photos, and if iCloud Photos is not enabled, no part of the system is functional.
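The threshold protection is the load-bearing one here, and it's easy to model in miniature. Here's a minimal sketch in Swift of the property Neuenschwander describes: no single match tells Apple anything, and only an account whose matched-voucher count crosses the threshold becomes visible for review. All type and function names are invented for illustration, the threshold value is arbitrary, and the real system hides each match cryptographically rather than with a plain boolean.

```swift
import Foundation

// Illustrative sketch (invented names, arbitrary threshold) of the
// threshold property described above: individual matches reveal
// nothing; only crossing the threshold flags an account for review.
// The `isMatch` boolean is a stand-in for the cryptographic blinding
// the real system uses.

struct SafetyVoucher {
    let isMatch: Bool
}

struct Account {
    let id: String
    var vouchers: [SafetyVoucher] = []
}

let reviewThreshold = 30  // arbitrary illustrative value

/// Returns the account ID only when the matched-voucher count crosses
/// the threshold; below it, nothing about the account is revealed.
func accountFlaggedForReview(_ account: Account) -> String? {
    let matches = account.vouchers.filter { $0.isMatch }.count
    return matches >= reviewThreshold ? account.id : nil
}
```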
Neuenschwander also confirms that if you’re not using iCloud Photos, none of the system operates:
If users are not using iCloud Photos, NeuralHash will not run and will not generate any vouchers. CSAM detection is a neural hash being compared against a database of the known CSAM hashes that are part of the operating system image. None of that piece, nor any of the additional parts including the creation of the safety vouchers or the uploading of vouchers to iCloud Photos, is functioning if you’re not using iCloud Photos.
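As a sketch of that gating, with invented names throughout (this is not Apple's API, just the control flow the quote describes): nothing in the pipeline, from hashing to voucher creation to upload, executes unless iCloud Photos is enabled.

```swift
import Foundation

// Hypothetical sketch of the gating behavior described above. All
// function and type names are invented for illustration.

struct PhotoLibrary {
    let iCloudPhotosEnabled: Bool
    let photos: [Data]
}

func processLibrary(_ library: PhotoLibrary) {
    // The entire pipeline is gated on iCloud Photos being enabled:
    // with it off, no hashing, no vouchers, no uploads.
    guard library.iCloudPhotosEnabled else { return }

    for photo in library.photos {
        let hash = computePerceptualHash(of: photo)   // stand-in for NeuralHash
        let voucher = makeSafetyVoucher(for: hash)    // blinded match result
        upload(voucher)                               // rides along with the iCloud upload
    }
}

// Stubs standing in for the on-device components the quote names.
func computePerceptualHash(of photo: Data) -> Data { Data() }
func makeSafetyVoucher(for hash: Data) -> Data { Data() }
func upload(_ voucher: Data) {}
```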
★ Wednesday, 11 August 2021