Apple SVP Craig Federighi responds to confusion over iOS 15 iCloud child safety policies in new interview

In a video interview with the Wall Street Journal, Apple SVP Craig Federighi discusses the reaction to the iCloud Child Safety features announced last week.

Federighi admits that the simultaneous announcement of the Messages protections for children and CSAM scanning, two similar features that work in very different ways, caused customer confusion, and that Apple could have done a better job of communicating the new initiative.

In the discussion with the Journal’s Joanna Stern, Federighi reiterates that the CSAM scanning technology only applies to photos set to be uploaded to iCloud, and is disabled if iCloud Photos is not in use.

The CSAM feature compares cryptographic hashes of photos on a user’s device against a database of known child sexual abuse material. To guard against false positives, the phone does not flag your account after a single match.

Instead, it waits until enough content has been flagged before reporting it to Apple for human review, and ultimately to NCMEC (the National Center for Missing & Exploited Children). Apple had not previously disclosed how large this threshold is, but Federighi appears to reveal it in the interview, saying it is ‘something on the order of 30 images matching’.
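To make the threshold mechanism concrete, here is a deliberately simplified sketch in Swift. Apple’s actual system uses a perceptual hash (NeuralHash) and cryptographic protocols such as private set intersection and threshold secret sharing, so neither the device nor Apple learns anything about individual matches below the threshold; none of that machinery is reproduced here. The function names, the SHA-256 stand-in hash, and the hash database are all hypothetical, used only to illustrate the “no single match flags an account” idea.

```swift
import Foundation
import CryptoKit

// Federighi's stated figure: "something on the order of 30 images matching".
let matchThreshold = 30

// Hypothetical database of known-image hashes (hex strings). In Apple's
// design this would be derived from NCMEC-provided material; here it is
// just an empty placeholder to keep the sketch self-contained.
let knownHashes: Set<String> = []

func hash(of imageData: Data) -> String {
    // Stand-in for a perceptual hash like NeuralHash. SHA-256 is used only
    // so the sketch compiles; unlike a perceptual hash, it is not robust to
    // resizing or re-encoding of the image.
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

func shouldEscalateForHumanReview(pendingUploads: [Data]) -> Bool {
    // Only photos queued for iCloud upload are considered, and a single
    // match never flags the account; escalation to human review happens
    // only once the match count reaches the threshold.
    let matches = pendingUploads
        .filter { knownHashes.contains(hash(of: $0)) }
        .count
    return matches >= matchThreshold
}
```

The key design point the sketch captures is that flagging is an aggregate decision over many photos rather than a per-photo one, which is how the system tolerates occasional false positives from the hash comparison.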
