

Concerns About Apple’s On-Device Image Fingerprint Scanning


Update on August 10th, 2021


In January 2020, Apple's Chief Privacy Officer seemed to say that CSAM scanning of iCloud servers was already happening, and Apple's Privacy Policy has allowed such scanning since May 2019. However, it is now unclear whether iCloud server CSAM scanning has actually been happening.

Apple now seems to be telling media that server-based CSAM scanning will start when on-device scanning starts. If so, my understanding around existing scanning was likely incorrect. I have made significant changes to this post to no longer assume that any scanning has been happening.


For many years now, the world has experienced an increasing lack of privacy. It’s been deeply troubling, but Apple has been a beacon of hope. To see a tech company with Apple’s market capitalization and brand power champion privacy as a core value and human right has made me feel that all was not lost.

Apple recently announced that iOS and iPadOS will use image fingerprint hashes to do on-device checks for the presence of known Child Sexual Abuse Material (CSAM) before the images are uploaded to iCloud Photos in the United States. This system may scale to other countries over time.

I disagree with this decision, and I will discuss why in detail. But, first, let me be clear: I hate CSAM. I don’t want it to be on iCloud Photos or anywhere else. In fact, during Tumblr’s first five years, I managed a Trust & Safety team that routinely took down potential CSAM content and reported it to the National Center for Missing and Exploited Children (NCMEC).

Second, let me also be clear: I understand Apple and NCMEC have positive intent here. They are trying to protect children from sexual abuse. It is a noble goal that I support. I am not seeking to demonize them, just to share an opposing point of view.

As with many things in life, what we have here is a question of how to best balance competing needs.

On one side of the scale, you have an understandable desire to keep known CSAM off iCloud Photos. On the other side, you have an understandable desire to protect user privacy.

How would I balance that scale? I would focus on these points:

- On-device fingerprint scanning does not meaningfully improve privacy over server-based scanning of iCloud Photos.
- On-device scanning opens a Pandora’s Box that governments may try to pry further open.
- Apple is the privacy leader among Big Tech companies, so the course it sets here will shape what other companies do.

Let’s explore those points in detail.

Server-Based vs. On-Device Scans

Server-based scans can find known CSAM if they are done comprehensively. So, what is the privacy justification for turning every iPhone and iPad in America into an on-device CSAM fingerprint scanner for iCloud Photos? Is that solution striking the right balance for the competing needs here?

It seems many people feel that on-device fingerprint scanning, which only sends matches to Apple, is more privacy-friendly than server-based scanning. I disagree with this view.

Yes, with on-device scanning, non-matches are not sent to Apple. That may seem like a big privacy win, but it’s actually a pretty small one.
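To make the comparison concrete, here is a minimal Python sketch of the on-device idea. It is purely illustrative and is not Apple’s system: Apple uses its own NeuralHash algorithm and a cryptographic matching protocol, neither of which is replicated here. The use of the third-party imagehash and Pillow libraries, the placeholder hash values, and the file path are all assumptions of mine for the example.

```python
# A generic perceptual-hash check, sketched for illustration only.
# Requires the third-party Pillow and imagehash packages.
from PIL import Image
import imagehash

# Hypothetical database of known fingerprints (hex-encoded perceptual hashes).
# These are placeholder values, not hashes of any real material.
KNOWN_FINGERPRINTS = {"d1c4f0b2a9e38571", "83a7e6c1d95b2f04"}

def should_flag_before_upload(photo_path: str) -> bool:
    """Fingerprint a photo locally and check it against the known set."""
    fingerprint = str(imagehash.phash(Image.open(photo_path)))
    return fingerprint in KNOWN_FINGERPRINTS

if __name__ == "__main__":
    # In the announced design, this check runs on-device before upload,
    # and only information about matches is surfaced to Apple.
    print(should_flag_before_upload("vacation.jpg"))  # hypothetical path
```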

iCloud Photos is not end-to-end encrypted. Photos are encrypted on Apple’s servers, but Apple holds the encryption keys, so it can still view, fingerprint, or scan any iCloud photo on its servers at any time. (Many other iCloud services similarly lack end-to-end encryption, including Backups, Drive, and Mail.)
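To see why this matters, consider a toy model of encryption at rest where the service, not the user, holds the key. Everything below is my own illustration, not Apple’s design: Fernet (from the Python cryptography package) stands in for whatever scheme iCloud actually uses, and a plain SHA-256 digest stands in for an image fingerprint. The point is simply that whoever holds the key can decrypt and scan at will, so scanning can happen entirely server-side.

```python
# Sketch of why server-side scanning needs no on-device component, assuming a
# generic encrypted-at-rest design. Fernet stands in for whatever scheme the
# real service uses, and SHA-256 stands in for an image fingerprint.
import hashlib
from cryptography.fernet import Fernet

SERVER_KEY = Fernet.generate_key()  # held by the service, not the user
vault = Fernet(SERVER_KEY)

def store(photo_bytes: bytes) -> bytes:
    """Encrypt a photo at rest, as the service would on upload."""
    return vault.encrypt(photo_bytes)

def scan_at_rest(stored_blob: bytes, known_hashes: set[str]) -> bool:
    """Decrypt with the server-held key, then fingerprint and check."""
    photo = vault.decrypt(stored_blob)
    return hashlib.sha256(photo).hexdigest() in known_hashes
```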

On-device fingerprint scanning doesn’t meaningfully enhance user privacy compared to server-based scanning. But, on-device scanning does create a new potential way for user privacy to be violated on a massive scale.

Governments, Data Sovereignty, and Pandora's Box

Apple has historically worked hard to make its devices encrypted black boxes. This has allowed it to tell governments that it has no way to see what is on a device, only what sits on Apple’s servers without end-to-end encryption.

With the notable exception of China, Apple has located its servers in countries with laws that align with its focus on privacy. For example, if Apple doesn’t want to scan servers located in the US at the behest of another country, it probably doesn’t legally have to. In fact, there might even be US law preventing the sort of scanning that the other country wants.

Some people claim that on-device scanning merely moves scanning from servers to devices and is thus nothing to worry about. But that isn’t true. On-device scanning could greatly compromise user privacy; it is potentially a Pandora’s Box.

Some governments may claim that data sovereignty means they can control how fingerprinting is used on devices that are physically in their country. Instead of devices being encrypted black boxes, they are now potentially data stores under the jurisdiction of local governments.

Apple is creating an on-device fingerprinting system focused on scanning for known CSAM in iCloud Photos. But, from a technical perspective, the system could scan for the fingerprint of any photo. With a little work, fingerprints could be generated and scanned for any type of file. And all fingerprints could potentially be harvested from devices, rather than only matches.
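Here is a final sketch, again entirely my own illustration, of why the capability generalizes. A plain SHA-256 file hash replaces the perceptual image hash to keep the example short, and the function name, directory paths, and hash-set names are all hypothetical. The point is that the matching machinery is content-agnostic: whoever supplies the hash database decides what gets searched for.

```python
# Content-agnostic hash matching: nothing here is specific to CSAM or photos.
# SHA-256 stands in for a perceptual hash purely to keep the sketch small.
import hashlib
from pathlib import Path

def scan_for_matches(root: str, target_hashes: set[str]) -> list[Path]:
    """Return every file under root whose SHA-256 digest is in target_hashes."""
    matches = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in target_hashes:
                matches.append(path)
    return matches

# The same function serves very different ends depending on who supplies
# the database (both calls below are hypothetical):
# scan_for_matches("/photos", known_csam_hashes)      # the announced purpose
# scan_for_matches("/documents", leaked_memo_hashes)  # a feared repurposing
```

Swapping in a different database is a configuration change, not an engineering project, which is exactly why expanding such a system’s scope would be so easy.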

While the term often gets overused, the potential for mass privacy violation here is Orwellian. Authoritarian regimes and court orders under seal could try to compel Apple to extend the use of fingerprinting to identify or spy on whistleblowers, activists, journalists, watchdogs, political opponents, LGBTQI persons, and more.

Apple tells us not to worry. Erik Neuenschwander, the company’s User Privacy Manager, says in a New York Times article that Apple will simply reject such demands from governments:

“We will inform them that we did not build the thing they’re thinking of,” he said.

Many governments don’t deal well with rejection. They can impose import tariffs and fines, close physical and online stores, revoke business licenses, imprison employees, and in some cases block apps and services with firewalls.

Will Apple be able to resist that sort of pressure if a government tries to apply it? Apple's experience in China gives me doubts. That said, Apple has lots of money, lawyers, and lobbyists. It may be able to say no in some cases. Or, it may be able to accept penalties or lost business in smaller markets.

However, what Apple can get away with may be very different from what smaller and more localized tech companies can get away with. And, the course that Apple sets here could have major implications for whether other companies implement on-device scanning.

Apple’s Global Privacy Leadership

I have a sheepish confession to make. When I first heard the news about Apple’s on-device scanning plan, I got angry. I dashed off a couple of hot-take tweets shaming Apple and Tim Cook for using devices and a service I pay for to surveil me. I said I would stop using Apple products over this. I suppose I was a part of what NCMEC has termed the “screeching voices of the minority.”

After I had some time to calm down, I deleted those tweets. This post is my attempt to be a more thoughtful voice of the minority. Writing it has helped me to understand why I initially reacted with such anger.

I care deeply about privacy. I know that when privacy is eroded, it is rarely restored. Of all the big tech companies, Apple is the privacy leader. If Apple isn’t championing on-device privacy, what major company will? If I wanted to leave Apple for a more privacy-focused tech ecosystem, what would that realistically be? I feel like my only Big Tech privacy champion has laid down its sword. And I’m worried that Apple’s leadership in privacy will cause other companies to embrace on-device scanning too.

I hope with all my heart that Apple changes course here. I’ve joined thousands in signing an online petition asking it to do so. If this post has been at all persuasive, I hope you will join me.