Apple introduced new child safety protections to help detect known child sexual abuse material (CSAM) in August 2021. Little more than a year later, it backed down on its CSAM-scanning plans, and now we know why.
Part of Apple’s child protection initiative was to identify known CSAM before it was uploaded to iCloud Photos, but the plan proved controversial among privacy advocates who worried that it set a precedent and that the technology could be misused. Now, it appears Apple ultimately agreed.
Apple is again being taken to task over its lack of CSAM protections for iCloud, prompting the company to explain its thinking: “scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit.”
A child safety group known as Heat Initiative recently sent a letter to Apple CEO Tim Cook demanding that the company do more to protect children who are abused, according to a Wired report. Cook didn’t respond, but Erik Neuenschwander, Apple’s director of user privacy and child safety, did.
The response was also shared with Wired, which means we now get a better understanding of Apple’s reasoning behind its decision to ditch CSAM scanning, even though many misunderstood how the feature would actually have worked.
Alongside creating new threat vectors, Neuenschwander argues that scanning iCloud data “would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”
Apple ultimately chose privacy over trying to prevent known CSAM material from being stored on its servers, it seems. “We decided to not proceed with the proposal for a hybrid client-server approach to CSAM detection for iCloud Photos from a few years ago,” Neuenschwander reportedly said in the response to Heat Initiative. “We concluded it was not practically possible to implement without ultimately imperiling the security and privacy of our users.”
Heat Initiative chief Sarah Gardner called Apple’s decision “disappointing.” “Apple is one of the most successful companies in the world with an army of world-class engineers,” Gardner wrote in a statement to Wired. “It is their responsibility to design a safe, privacy-forward environment that allows for the detection of known child sexual abuse images and videos. For as long as people can still share and store a known image of a child being raped in iCloud, we will demand that they do better.”