As relayed by TechCrunch and many other outlets, Apple has announced it’s delaying the release of its CSAM (Child Sexual Abuse Material) scanning features. Here’s the statement Apple PR furnished to media outlets:
Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
The roll-out of these features was utterly botched, from conflating features that should’ve been kept separate to the rolling thunder of damage control as the company trotted out technical white papers, FAQs, and executive interviews to try to explain what they really, actually meant. And there appear to be some serious technical criticisms of the methods Apple is using to detect CSAM.
It’s all for the best that Apple has decided to delay. Perhaps the company will re-think some aspects of its approach. It’s unlikely that this functionality will simply disappear forever, but after a few months in the penalty box maybe it can emerge alongside a clearer, stronger message. It couldn’t be much worse than this first effort.