90+ groups demand Apple drop its plans to check iPhones for CSAM content

More than 90 international policy groups have banded together to deliver an open letter to Apple CEO Tim Cook demanding that Apple ditch its plans to check iPhones and iPads for known CSAM content in iCloud Photos and to flag inappropriate photos sent to and from kids.

In an announcement titled “International Coalition Calls on Apple to Abandon Plan to Build Surveillance Capabilities into iPhones, iPads, and other Products,” the groups raise two main issues with Apple’s new child safety plans.

In particular:

  • The scan and alert feature in Messages could result in alerts that threaten the safety and wellbeing of some young people, and LGBTQ+ youths with unsympathetic parents are particularly at risk.

  • Once the CSAM hash scanning for photos is built into Apple products, the company will face enormous pressure, and possibly legal requirements, from governments around the world to scan for all sorts of images that the governments find objectionable.

First reported by Reuters, the letter can now be found in full online, but it’s worth noting that Apple has already addressed both of the issues raised here.

On the first issue, alerts are triggered only by images sent and received via the Messages app; no text message will trigger an alert of any kind. Apple also confirmed during press briefings that kids will be warned before any parental notification is triggered. Having been told that their parents will be notified, they must expressly click through that warning to see the photo in question. Parents won’t be notified of anything without the child’s knowledge.
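To make that ordering concrete, here is a minimal, hypothetical sketch of the consent-gated flow in Swift. None of the names below come from Apple’s actual API; they only illustrate the sequence described above: the image is blurred, the child is warned, and a parental notification is queued only after an explicit click-through, never silently.

```swift
import Foundation

// Hypothetical sketch; these types do not exist in Apple's API.
enum ChildChoice {
    case declined      // child backs out; photo stays hidden, no one is notified
    case viewedAnyway  // child clicks through the warning
}

struct SensitiveImageGate {
    let parentalNotificationsEnabled: Bool

    func handleFlaggedImage(childChoice: ChildChoice) -> [String] {
        // The warning always comes first, before anything is shown or sent.
        var events = ["Image blurred; warning shown to child"]
        switch childChoice {
        case .declined:
            events.append("Photo not shown; no notification sent")
        case .viewedAnyway:
            events.append("Photo shown after explicit click-through")
            if parentalNotificationsEnabled {
                // The child was told this would happen in the warning itself.
                events.append("Parental notification queued")
            }
        }
        return events
    }
}

let gate = SensitiveImageGate(parentalNotificationsEnabled: true)
print(gate.handleFlaggedImage(childChoice: .viewedAnyway))
// ["Image blurred; warning shown to child",
//  "Photo shown after explicit click-through",
//  "Parental notification queued"]
```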


On the second issue, Apple has repeatedly said that it will not be swayed by governments and law enforcement if and when demands are made to use the CSAM detection system to detect other types of material. Apple also points out that the hashes against which iCloud Photos are matched are supplied only by known child protection agencies. What’s more, all of this is auditable, says Apple.
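For readers who want the mechanism in miniature: Apple’s actual system derives a perceptual “NeuralHash” on-device and compares it against the known-CSAM list using cryptographic private set intersection, so this is only a loose conceptual sketch, not Apple’s implementation. Reduced to its core idea, the matching step is membership in a fixed, externally supplied hash set:

```swift
import Foundation

// Deliberately simplified sketch; the real system uses perceptual hashing
// and private set intersection rather than exact Set lookups.
struct KnownHashDatabase {
    // In Apple's design these hashes come only from child safety
    // organizations, and the list shipped on-device is auditable.
    private let knownHashes: Set<Data>

    init(knownHashes: Set<Data>) {
        self.knownHashes = knownHashes
    }

    func matches(photoHash: Data) -> Bool {
        knownHashes.contains(photoHash)
    }
}

// Hypothetical usage: the device can only answer "is this hash on the
// supplied list?" It cannot be asked to search for arbitrary content
// outside that list.
let database = KnownHashDatabase(knownHashes: [Data([0x01, 0x02])])
print(database.matches(photoHash: Data([0x01, 0x02]))) // true
print(database.matches(photoHash: Data([0xAA, 0xBB]))) // false
```

The coalition’s worry, in these terms, is about who gets to populate that list in the future, not about how the lookup itself works.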

Despite this, the coalition maintains that Apple will be “installing surveillance software” on iPhones, a characterization Apple will no doubt strongly dispute.

Read more at iMore.
