Apple’s Employees Are Reportedly Concerned About Its CSAM Detection Methods

Apple’s new CSAM detection plans are supposed to help it weed out child exploitation images that are uploaded to iCloud Photos, but privacy advocates around the world have already spoken out against the steps Apple will take. Now, a new report says that Apple’s own employees have misgivings, too.

Apple’s CSAM detection will check every photo uploaded to iCloud Photos, comparing a NeuralHash of each image against hashes of known CSAM. The first part of that check will be carried out on-device, on iPhones and iPads, and it is that on-device component that has people concerned.
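The general idea behind that check can be sketched as hash-list matching: hash the image, then test membership against a set of known hashes. NeuralHash is a proprietary perceptual hash, so the sketch below substitutes SHA-256 purely for illustration; unlike NeuralHash, it only matches byte-identical files, and all names here are hypothetical.

```python
# Simplified sketch of hash-list matching, assuming SHA-256 as a
# stand-in for Apple's proprietary NeuralHash (which is perceptual
# and can match visually similar images; SHA-256 cannot).
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Placeholder for a perceptual hash such as NeuralHash.
    return hashlib.sha256(image_bytes).hexdigest()

def check_upload(image_bytes: bytes, known_hashes: set) -> bool:
    # True if the image's hash appears in the known-CSAM hash list.
    return image_hash(image_bytes) in known_hashes

# Hypothetical usage:
known = {image_hash(b"flagged-image-bytes")}
print(check_upload(b"flagged-image-bytes", known))  # True
print(check_upload(b"innocent-photo-bytes", known)) # False
```

In Apple’s actual design the on-device result is not a plain boolean; it is wrapped in cryptographic safety vouchers that only become readable to Apple after a threshold of matches, but the core membership test works along these lines.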

A new Reuters report says that Apple employees have taken to Slack to voice their concerns.

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.

Much of the concern revolves around whether a government could force Apple to check for other content, not just CSAM. Apple says that isn’t something that could happen, but that hasn’t been enough to satisfy skeptics so far.

Having to convince its own employees, as well as the public, is a distraction the company could do without as the iPhone 13 announcement draws ever closer.
