Is Apple actually snooping on your photos? Jefferson Graham wrote an article last week warning as much, based on the company’s child safety announcement. An attention-grabbing headline? Certainly. Accurate? It’s complicated.
There has been much criticism from privacy advocates, notably from the EFF and Edward Snowden. The criticism is warranted, but it should be grounded in the technical details rather than hyperbole.
So in layman’s terms, what’s going on?
1) Families enrolled in iCloud Family Sharing will get tools to counter the sharing of explicit content.
If you have Family Sharing enabled and Apple knows that a user is under the age of 13, the device will scan images sent and received in Messages for sexually explicit content.
The key here is that this feature only applies to users under the age of 13, and only within the Messages app. Parents can also switch on a feature that alerts them if a child ignores a warning about an image.
So is Apple snooping on your photos in this instance? In my eyes, the answer is no.
2) All users who use iCloud Photos will have their photos checked against a database of digital fingerprints (known as hashes) to identify Child Sexual Abuse Material (CSAM).
First, we need to understand what a hash is. Each image connected to iCloud Photos is analyzed on the device, and a unique number is derived from it. The technology is clever enough that if you edit a photo, through cropping or filters, the same number is still produced.
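To make the idea concrete, here is a minimal sketch of deriving a number from image bytes. Note the assumption up front: this uses an ordinary cryptographic hash (SHA-256) purely to illustrate "content in, fixed number out"; Apple's actual system uses NeuralHash, a perceptual hash designed so that cropped or filtered versions of a photo still map to the same value, a property SHA-256 deliberately does not have.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in fingerprint using SHA-256 for illustration only.
    # Apple's real system uses a *perceptual* hash (NeuralHash),
    # which tolerates crops and filters; a cryptographic hash like
    # this one changes completely if even one byte of the image changes.
    return hashlib.sha256(image_bytes).hexdigest()

photo = b"...raw image bytes..."
print(fingerprint(photo))                         # a fixed-size number derived from the content
print(fingerprint(photo) == fingerprint(photo))   # same input always yields the same number
```

The key point is that the number travels with the photo's content, not with any human-readable description of it.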
The National Center for Missing and Exploited Children (NCMEC) provided Apple with a list of hashes of known CSAM photos. If your photo’s hash does not match any on that list, the system moves on. The actual photo isn’t visible to anyone.
If a match is found, it is recorded in a database tied to your iCloud account. If that database grows past a certain threshold (the value of which is not publicly known), Apple disables your iCloud account and sends a report to the NCMEC.
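The matching flow described above can be sketched in a few lines. Everything here is hypothetical for illustration: the hash values, the function name, and especially the threshold, which Apple has not disclosed.

```python
# Toy sketch of the match-and-threshold flow. The hash strings and the
# threshold value are invented; Apple's real threshold is not public.
KNOWN_CSAM_HASHES = {"hash_a", "hash_b"}   # hypothetical list supplied by NCMEC
MATCH_THRESHOLD = 3                        # hypothetical; real value undisclosed

def review_account(photo_hashes):
    # Count how many of the account's photo hashes match the known list.
    matches = [h for h in photo_hashes if h in KNOWN_CSAM_HASHES]
    if len(matches) >= MATCH_THRESHOLD:
        return "account disabled, report sent to NCMEC"
    return "no action"

print(review_account(["hash_x", "hash_y"]))   # no matches -> "no action"
```

Note that the comparison only ever touches hashes; at no point in this flow does anyone look at the photo itself unless the threshold is crossed.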
So is Apple snooping on your photos in this scenario? Maybe. It depends on what you consider snooping. Apple can’t see your photographs, only the hash, which it checks against the known CSAM hashes.
Bear in mind that this is only enabled for photos synced to an iCloud account, so you have other options (like using Google Photos) if you aren’t comfortable with the analysis of your photos.
It is worth remembering that Android and Apple devices already analyze your photos to make them searchable. If you have a pet, type “pet” into the search box and your pets appear. Analyzing photos is not a new technology; CSAM detection extends those capabilities for what Apple sees as the common good.
3) Apple is adding guidance to Siri and Search related to CSAM.
This has nothing to do with scanning photos. If you search (using the iPhone search, not Safari), or ask Siri about CSAM content, it will provide you with links on how to report CSAM or tell you that interest in the topic can be harmful or problematic.
This will have the least impact on users, as I’m not sure people ask Siri about CSAM anyway! You can read Apple’s full explanation of that in this document.
To Summarize
1) Explicit content checks take place on devices known to Apple, through iCloud Family Sharing, to belong to a child under 13. If you are over 13, your photos aren’t scanned.
2) Your iCloud-connected photo library will have a unique number (a hash) assigned to each photo. If that number matches a known CSAM hash, it will be added to a database within your iCloud account. If you have too many photos of this type, your account may be disabled and reported to the authorities.
3) You have a choice on whether or not you want this technology to run on your phone. You can decide not to use iCloud to store your photos or opt out of family sharing for your children.
Now that we have delved beyond the hyperbole, you are in a good position to make an informed decision about this technology. I encourage you to read both the criticism and praise for this method and make up your mind based on that.