Last month, Apple announced a handful of new child safety features that proved to be controversial, including CSAM detection for iCloud Photos. Now, Apple says it will “take additional time” to refine the features before launching them to the public.
In a statement to 9to5Mac, Apple said:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple’s new child safety features were set to launch as part of updates to iOS 15, iPadOS 15, and macOS Monterey later this year. There is now no word on when the company plans to roll out the features. Apple’s statement today does not provide any details on what changes the company could make to improve the system.
As a refresher, here are the basics of how the CSAM detection system would work as currently designed:
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
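The matching flow described above can be sketched, very loosely, in Python. This is not Apple’s implementation: the real system uses a perceptual NeuralHash and private set intersection, so the device never learns the match result itself. The plain SHA-256 set lookup and all function names below are hypothetical stand-ins for illustration only.

```python
import hashlib

def build_hash_db(known_images: list[bytes]) -> set[str]:
    """Hypothetical stand-in for the database of known CSAM hashes.
    The real database holds perceptual NeuralHash values, transformed
    into an unreadable form before being stored on devices."""
    return {hashlib.sha256(img).hexdigest() for img in known_images}

def make_safety_voucher(image: bytes, hash_db: set[str]) -> dict:
    """Simulate the on-device check performed before an image is
    uploaded to iCloud Photos."""
    digest = hashlib.sha256(image).hexdigest()
    matched = digest in hash_db
    # In the real protocol, private set intersection hides this result
    # from the device; it is sealed inside an encrypted safety voucher
    # that is uploaded alongside the image.
    return {"image_hash": digest, "match": matched}

db = build_hash_db([b"flagged-image-bytes"])
voucher_hit = make_safety_voucher(b"flagged-image-bytes", db)
voucher_miss = make_safety_voucher(b"ordinary-photo-bytes", db)
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; Apple’s perceptual NeuralHash is designed to also match visually identical images that differ at the byte level.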
Upon announcement, the new CSAM detection technology received quite a bit of pushback and criticism from privacy advocates. Apple, however, doubled down on the feature multiple times, and said that its implementation would actually be more privacy-preserving than technology used by other companies like Google and Facebook.
It was also revealed through this process that Apple already scans iCloud Mail for CSAM; the new system would have expanded that scanning to iCloud Photos.
Other child safety features announced by Apple last month, and also now delayed, include communication safety features in Messages and expanded guidance in Siri and Search.
What do you make of Apple’s decision to delay the rollout of its new child safety features? Is it the right decision, or should the company have stuck to its initial plan?