Friday August 13, 2021

The Latest from Apple newsroom that can affect your photos


Jefferson Graham wrote an article last week warning as much, based on the company’s child safety announcement. An attention-grabbing headline? Certainly. Accurate? It’s complicated. Rumor has it that Apple can, and might, snoop on your photos if required. Coming on the heels of last year’s data policy controversy, the news is being debated worldwide.

There has been much criticism from privacy advocates, notably the EFF and Edward Snowden. That criticism is warranted; however, it should be grounded in the technical details rather than in hyperbole.

Let’s see what Snowden has tweeted so far:

[Image: Edward Snowden’s tweet]

Families enrolled in iCloud Family Sharing will get tools to counter the sharing of explicit content.

If you have Family Sharing enabled and Apple knows that a user is under the age of 13, the device will scan images in Messages, both sent and received, for sexually explicit content.

The key here is that this feature is only enabled for users under the age of 13 using the Messages app. Parents can also switch on a feature that allows them to get alerts if children ignore a warning about the message.
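To make that flow concrete, here is a small illustrative sketch in Python. The data model and function names are my own assumptions, not Apple’s API, and the explicit-content check stands in for the on-device classifier; it only shows the gating described above (Family Sharing, under 13, optional parental alerts).

```python
# Illustrative-only gating logic for the Messages feature described above.
# The field and function names are assumptions, not Apple's API; the
# explicit-content flag stands in for the on-device classifier.

from dataclasses import dataclass

@dataclass
class ChildAccount:
    family_sharing_enabled: bool
    age: int
    parent_alerts_enabled: bool  # the optional parental-notification switch

def handle_incoming_image(account: ChildAccount, image_is_explicit: bool,
                          child_viewed_anyway: bool) -> list[str]:
    """Return the actions taken for one image, per the flow described above."""
    actions = []
    if not (account.family_sharing_enabled and account.age < 13):
        return actions  # the feature is simply not active for this user
    if image_is_explicit:
        actions.append("blur image and warn the child")
        if child_viewed_anyway and account.parent_alerts_enabled:
            actions.append("notify parents")
    return actions

print(handle_incoming_image(ChildAccount(True, 11, True),
                            image_is_explicit=True, child_viewed_anyway=True))
```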

So is Apple snooping on your photos in this instance? In my eyes, the answer is no.


Apple Newsroom on Child Sexual Abuse Material (CSAM).

First, we need to understand what a hash is. Images connected to iCloud Photos are analyzed on the device and a unique number is assigned to each one. The technology is clever enough that if you edit a photo through cropping or filters, the same number is still assigned to it.

The National Center for Missing and Exploited Children provided Apple with a list of hashes of known CSAM photos. If your photo’s hash does not match any hash on that list, the system moves on. The actual photo isn’t visible to anyone.
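To make the hash-and-compare step more concrete, here is a deliberately tiny sketch in Python. This is not NeuralHash (Apple’s hash comes from a neural network); it’s a toy “average hash” over an 8×8 grayscale grid, and the list of known hashes is a made-up placeholder. The point is only that the photo is reduced to a number, and it’s the number, not the photo, that gets compared.

```python
# A toy stand-in for the on-device hashing step. This is NOT NeuralHash --
# it's a simple "average hash" over an 8x8 grayscale grid, shown only to
# illustrate that a photo is reduced to a compact number and that only the
# number is compared against the known-CSAM list.

def average_hash(pixels):
    """Map an 8x8 grid of brightness values (0-255) to a 64-bit integer."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

# Hypothetical placeholder for the hash list supplied by the NCMEC; the
# photos behind these hashes are never part of the comparison.
known_csam_hashes = {0xF0F0F0F0F0F0F0F0}

photo = [[200 if (x + y) % 2 == 0 else 40 for x in range(8)] for y in range(8)]

if average_hash(photo) in known_csam_hashes:
    print("match recorded against the account")
else:
    print("no match -- the system moves on")
```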

If a match is found, that match is added to a database held against your iCloud account. If that database grows beyond a certain number (which Apple has not made public), Apple disables your iCloud account and sends a report to the NCMEC.
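As a rough sketch of that account-level bookkeeping, the flow could look like the Python below. The threshold of 30 is purely an assumption for illustration (Apple has not published the real number), and the helper functions are hypothetical.

```python
# A rough sketch of the account-level flow described above. The threshold is
# a placeholder -- Apple has not published the real value -- and the helper
# functions are hypothetical.

MATCH_THRESHOLD = 30  # assumption for illustration only

account_matches: dict[str, int] = {}

def disable_account(account_id: str) -> None:
    print(f"{account_id}: iCloud account disabled pending human review")

def report_to_ncmec(account_id: str) -> None:
    print(f"{account_id}: report filed with the NCMEC")

def record_match(account_id: str) -> None:
    """Record one hash match against an account; act when the threshold is crossed."""
    account_matches[account_id] = account_matches.get(account_id, 0) + 1
    if account_matches[account_id] == MATCH_THRESHOLD:
        disable_account(account_id)
        report_to_ncmec(account_id)

# Simulate an account accumulating matches until the threshold is crossed.
for _ in range(MATCH_THRESHOLD):
    record_match("example-account")
```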

So is Apple snooping on your photos in this scenario? Maybe.

It depends on what you consider snooping. Apple can’t see your photographs, only the hash, which is then checked against the known CSAM hashes.

Bear in mind that this is only enabled for those who use the Photos app attached to an iCloud account, so you have other options (like using Google Photos) if you aren’t comfortable with the analysis of your photos.

It is worth remembering that Android and Apple devices already analyze your photos to make them searchable. If you have a pet, type “pet” into the search box and your pet photos appear. Analyzing photos is not a new technology, but CSAM detection extends those capabilities for what Apple sees as the common good.

Apple Newsroom on Siri and Search related to CSAM.

This has nothing to do with scanning photos. If you search (using the iPhone search, not Safari), or ask Siri about CSAM content, it will provide you with links on how to report CSAM or tell you that interest in the topic can be harmful or problematic.

This will have the least impact on users, as I’m not sure people ask Siri about CSAM anyway! You can read Apple’s full explanation of that in this document.

Apple’s technology for analyzing images

Apple’s technical summary on CSAM detection includes a few privacy promises in the introduction. “Apple does not learn anything about images that do not match the known CSAM database,” it says. “Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.”

Apple newsroom tells us that the hashing technology is called NeuralHash, and it “analyzes an image and converts it to a unique number specific to that image. Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value,” Apple wrote.

Before an iPhone or other Apple device uploads an image to iCloud, the “device creates a cryptographic safety voucher that encodes the match result. It also encrypts the image’s NeuralHash and a visual derivative. This voucher is uploaded to iCloud Photos along with the image.”
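Based only on that description, a safety voucher can be pictured as a small bundle of three opaque fields, as in the sketch below. The structure and the stand-in “encryption” are illustrative guesses, not Apple’s actual format; in the real system the contents are protected so that Apple cannot read them on their own.

```python
# A loose structural sketch (not Apple's actual format) of what a "safety
# voucher" carries, per the description above: an encoded match result plus
# an encrypted NeuralHash and visual derivative, uploaded with the image.

from dataclasses import dataclass

@dataclass(frozen=True)
class SafetyVoucher:
    encoded_match_result: bytes      # opaque on its own
    encrypted_neural_hash: bytes     # only readable once the threshold is met
    encrypted_visual_derivative: bytes

def make_voucher(match_blob: bytes, neural_hash: bytes, derivative: bytes,
                 encrypt=lambda b: b[::-1]) -> SafetyVoucher:
    # `encrypt` is a stand-in (it just reverses bytes); the real scheme uses
    # threshold secret sharing, sketched in the next section.
    return SafetyVoucher(match_blob, encrypt(neural_hash), encrypt(derivative))

voucher = make_voucher(b"match-metadata", b"\x12\x34\x56\x78", b"thumbnail-bytes")
print(voucher)
```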

Using “threshold secret sharing,” Apple’s “system ensures that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content,” the document said. “Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images.”
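Threshold secret sharing is a general cryptographic idea: a secret is split into shares so that any t of them reconstruct it, while fewer reveal nothing useful. The minimal Shamir-style sketch below illustrates that property with made-up numbers; it is not Apple’s construction or its parameters.

```python
# Minimal Shamir-style threshold secret sharing over a prime field, purely to
# illustrate the "t shares reconstruct, fewer reveal nothing" property that
# Apple's description relies on. Not Apple's construction or parameters.

import random

PRIME = 2**61 - 1  # a large prime; all arithmetic below is done modulo this

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0; only correct with >= threshold shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

voucher_key = 123456789  # stands in for whatever protects the voucher contents
shares = make_shares(voucher_key, threshold=3, count=5)
print(recover(shares[:3]) == voucher_key)  # True: three shares are enough
print(recover(shares[:2]) == voucher_key)  # False: too few shares, no key
```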

While noting the one-in-one-trillion probability of a false positive, Apple said it “manually reviews all reports made to NCMEC to ensure reporting accuracy.” Users can “file an appeal to have their account reinstated” if they believe their account was mistakenly flagged.

Verdict!

Now that we have delved beyond the hyperbole, you are in a good position to make an informed decision about this technology. I encourage you to read both the criticism and praise for this method and make up your mind based on that.


Disclosure: William Damien worked part-time at an Apple retail location seven years ago. The opinions expressed above are solely those of the author.


Image credits: Header photo licensed via Depositphotos.
