Apple is killing its plan to scan your photos for CSAM. Here's what's next

In August 2021, Apple announced a plan to scan photos that users store in iCloud for child sexual abuse material (CSAM). The tool was meant to be privacy-preserving, allowing the company to flag potentially problematic and abusive content without revealing anything else. But the initiative was controversial, and it quickly drew criticism from privacy and security researchers and digital rights groups, who worried that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. In early September 2021, Apple said it would pause the rollout of the feature to “gather feedback and make improvements before releasing these critically important child safety features.” In other words, a launch was still coming. Now the company says that, in response to the feedback and guidance it received, the CSAM detection tool for iCloud Photos is dead.

Instead, Apple told WIRED this week that it is focusing its anti-CSAM efforts and investments on its “Communication Safety” features, which the company first announced in August 2021 and launched last December. Parents and caregivers can opt in to the protections through family iCloud accounts. The features work in Siri, Apple’s Spotlight search, and Safari search to warn anyone who is viewing or searching for child sexual abuse material and to provide resources on the spot for reporting the content and seeking help. At the heart of the protection is Communication Safety for Messages, which caregivers can set up to warn children and point them to resources if they receive or attempt to send photos that contain nudity. The goal is to stop child exploitation before it happens or becomes entrenched, and to reduce the creation of new CSAM.

“After extensive consultation with experts to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” the company told WIRED in a statement. “We have also decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”

Apple’s CSAM update comes alongside its announcement today that the company is vastly expanding its end-to-end encryption offerings for iCloud, including adding protection for backups and photos stored in the cloud service. Child safety experts and technologists working to combat CSAM have often opposed broader deployment of end-to-end encryption because it renders user data inaccessible to tech companies, making it more difficult for them to scan and flag CSAM. Law enforcement agencies around the world have similarly cited the dire problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, though many of these agencies have historically been hostile toward end-to-end encryption in general because it can make some investigations more challenging. Research has consistently shown, however, that end-to-end encryption is a vital safety tool for protecting human rights, and that the downsides of its implementation do not outweigh the benefits.

Communication Safety for Messages is opt-in and analyzes image attachments that users send and receive on their devices to determine whether a photo contains nudity. The feature is designed so that Apple never gets access to the messages, the end-to-end encryption that Messages offers is never broken, and Apple doesn’t even learn that a device has detected nudity.
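To illustrate that design constraint, here is a minimal Swift sketch of what a purely on-device check could look like. The `SensitiveImageAnalyzing` protocol and `IncomingAttachmentHandler` type are hypothetical names invented for this example, not Apple’s actual API; the point is only that the classification and the resulting decision never leave the device.

```swift
import CoreGraphics

// Hypothetical stand-in for an on-device classifier; all analysis stays local.
protocol SensitiveImageAnalyzing {
    func containsNudity(_ image: CGImage) -> Bool
}

struct IncomingAttachmentHandler {
    let analyzer: SensitiveImageAnalyzing
    let protectionEnabled: Bool  // opted in by a parent or caregiver for a child account

    /// Decides, entirely on the device, whether to blur an incoming image
    /// and show the child a warning alongside guidance resources.
    func shouldBlurAndWarn(for image: CGImage) -> Bool {
        guard protectionEnabled else { return false }
        // The result only drives the local UI; nothing is reported to Apple,
        // so the end-to-end encryption of the message itself is never broken.
        return analyzer.containsNudity(image)
    }
}
```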

The company told WIRED that while it isn’t ready to announce a specific timeline for expanding its Communication Safety features, it is working on adding the ability to detect nudity in videos sent via Messages when the protection is enabled. The company also plans to expand the offering beyond Messages to its other communication apps. Ultimately, the goal is to let third-party developers incorporate the Communication Safety tools into their own applications. The more the features can proliferate, Apple says, the more likely it is that children will get the information and support they need before they are exploited.
