Apple backed out of a controversial child protection feature and now we know why

Apple introduced new child safety protections to help detect known child sexual abuse material (CSAM) in August 2021. Little more than a year later, it backed down on its CSAM-scanning plans, and now we know why.

Part of Apple’s child protection initiative was to identify known CSAM before it was uploaded to iCloud Photos, but that proved controversial among privacy advocates, who worried that a precedent was being set and that the technology could be misused. Now, it appears that Apple ultimately agreed.
