Apple CSAM system tricked, but company says it guards against this

An early version of the Apple CSAM system has effectively been tricked into flagging an innocent image, after a developer reverse-engineered part of it. Apple, however, says that it has additional protections to guard against this happening in real-life use.

The latest development occurred after the NeuralHash algorithm was posted to the open-source developer site GitHub, enabling anyone to experiment with it…

Background

All CSAM systems work by importing a database of known child sexual abuse material from organizations like the National Center for Missing and Exploited Children (NCMEC). This database is provided in the form of hashes, or digital fingerprints, derived from the images.

While most tech giants scan uploaded photos in the cloud, Apple uses its NeuralHash algorithm on a customer’s iPhone to generate hashes of the photos stored there, then compares these against a downloaded copy of the CSAM hash database.
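The shape of that on-device check can be sketched as follows. This is purely illustrative: the real system uses Apple’s NeuralHash model and a cryptographic private set intersection protocol, whereas here a hypothetical `toy_image_hash` function stands in for both.

```python
import hashlib

def toy_image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real perceptual hash tolerates
    # small edits to an image; this cryptographic hash does not.
    return hashlib.sha256(image_bytes).hexdigest()[:24]

# Downloaded copy of the known-image hash database (hypothetical entries).
known_hashes = {
    toy_image_hash(b"known-bad-image-1"),
    toy_image_hash(b"known-bad-image-2"),
}

def matches_database(image_bytes: bytes) -> bool:
    # On-device check: hash the local photo and look it up in the
    # downloaded hash set, without uploading the photo itself.
    return toy_image_hash(image_bytes) in known_hashes

print(matches_database(b"known-bad-image-1"))  # True
print(matches_database(b"holiday-photo"))      # False
```

The key design point the sketch captures is that only hashes leave the device in Apple’s scheme, not the photos themselves.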

A developer yesterday claimed to have reverse-engineered Apple’s algorithm, posting the code to GitHub – a claim that Apple effectively confirmed.

Apple CSAM system tricked

Within hours of the GitHub posting, researchers succeeded in using the algorithm to create a deliberate false positive – two completely different images that generated the same hash value. This is known as a collision.

Collisions are always a risk with such systems, since a hash is by nature a greatly simplified representation of an image, but many were surprised that someone was able to generate one so quickly.

The collision deliberately created here is simply a proof of concept. Developers have no access to the CSAM hash database, which would be required to create a false positive in the live system, but it does prove that collision attacks are relatively easy in principle.
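Why collisions must exist can be shown with a deliberately weak toy hash. The `tiny_hash` below is hypothetical and far lossier than NeuralHash, but it illustrates the underlying pigeonhole problem: any function mapping many possible images to few hash values must map some distinct inputs to the same output.

```python
def tiny_hash(data: bytes) -> int:
    # Hypothetical, deliberately weak 8-bit hash: only 256 possible
    # outputs, so collisions are trivial to construct.
    return sum(data) % 256

image_a = b"completely different image A"

# Craft a second, different input with the same hash by appending one
# padding byte that makes the byte-sums agree modulo 256.
base = b"some other image entirely"
pad = (sum(image_a) - sum(base)) % 256
image_b = base + bytes([pad])

assert image_a != image_b
assert tiny_hash(image_a) == tiny_hash(image_b)  # a collision
```

NeuralHash has a vastly larger output space, so collisions cannot be found this trivially, but the researchers’ result shows they can still be constructed adversarially.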

Apple says it has two protections against this

Apple effectively confirmed that the algorithm was the basis for its own system, but told Motherboard that it is not the final version. The company also said it was never intended to be secret.

Apple told Motherboard in an email that the version analyzed by users on GitHub is a generic one, not the final version that will be used for iCloud Photos CSAM detection, and that it had deliberately made the algorithm public.

“The NeuralHash algorithm [… is] included as part of the code of the signed operating system [and] security researchers can verify that it behaves as described,” one of Apple’s pieces of documentation reads. 

The company went on to say there are two further steps: a secondary (secret) matching system run on its own servers, and a manual review.

Apple also said that after a user passes the 30-match threshold, a second non-public algorithm that runs on Apple’s servers will check the results.
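That threshold-then-recheck logic can be sketched roughly as below. This is not Apple’s code: the function names, the `secondary_hash` parameter, and the toy databases are all hypothetical, and the real secondary algorithm is non-public.

```python
THRESHOLD = 30  # Apple's stated match threshold before any server-side action

def server_side_review(flagged_images, secondary_hash, secondary_db):
    """Return the flagged images that survive the second, independent check.

    flagged_images: images whose on-device NeuralHash matched the database.
    secondary_hash: a second, independent hash function run on the server.
    secondary_db:   hashes of the known images under that second function.
    """
    # Below the threshold, nothing is examined at all.
    if len(flagged_images) < THRESHOLD:
        return []
    # An adversarial collision that fooled only the first hash is very
    # unlikely to also match under the independent second hash, so it is
    # discarded here before any human review.
    return [img for img in flagged_images
            if secondary_hash(img) in secondary_db]
```

The design rationale, per Apple’s documentation, is that an attacker who only has the public on-device algorithm cannot target the server-side hash.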

“This independent hash is chosen to reject the unlikely possibility that the match threshold was exceeded due to non-CSAM images that were adversarially perturbed to cause false NeuralHash matches against the on-device encrypted CSAM database.”

Finally, as previously discussed, there is a human review of the images to confirm that they are CSAM.

The only real risk, says one security researcher, is that anyone who wanted to mess with Apple could flood the human reviewers with false positives.

“Apple actually designed this system so the hash function doesn’t need to remain secret, as the only thing you can do with ‘non-CSAM that hashes as CSAM’ is annoy Apple’s response team with some garbage images until they implement a filter to eliminate those garbage false positives in their analysis pipeline,” Nicholas Weaver, senior researcher at the International Computer Science Institute at UC Berkeley, told Motherboard in an online chat.

You can read more about the Apple CSAM system, and the concerns being raised, in our guide.

Photo: Alex Chumak/Unsplash

