Amid controversy surrounding Apple’s CSAM detection system, a San Francisco Bay Area doctor has been charged with possessing child pornography in his Apple iCloud account, according to federal authorities.
The U.S. Department of Justice announced Thursday that Andrew Mollick, 58, had at least 2,000 sexually exploitative images and videos of children stored in his iCloud account. Mollick is an oncology specialist affiliated with several Bay Area medical facilities, as well as an associate professor at UCSF School of Medicine.
Additionally, he uploaded one of the images to social media app Kik, according to the recently unsealed federal complaint (via KRON4).
Apple recently announced plans to introduce a system designed to detect child sexual abuse material (CSAM) in iCloud and report it to the National Center for Missing and Exploited Children (NCMEC). The system, which relies on cryptographic techniques to preserve user privacy, has drawn controversy in the digital rights and cybersecurity communities.
The system does not scan the actual images in a user’s iCloud account. Instead, it matches hashes of images stored in iCloud against known CSAM hashes provided by at least two child safety organizations. An account is only flagged once at least 30 pieces of known CSAM are matched, a threshold meant to mitigate false positives.
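For readers curious what threshold-based hash matching looks like in practice, here is a minimal Python sketch. It is illustrative only: the function names, the SHA-256 stand-in, and the plain set lookup are assumptions for clarity. Apple’s actual system computes a perceptual hash (NeuralHash) on-device and uses cryptographic private set intersection, so neither the device nor the server learns anything about non-matching images.

```python
import hashlib

# Illustrative constant drawn from the article: roughly 30 matches are
# required before an account is surfaced for review.
MATCH_THRESHOLD = 30

def image_hash(data: bytes) -> str:
    """Placeholder hash. The real system derives a perceptual hash so that
    visually similar images map to the same value; SHA-256 matches only
    byte-identical files and is used here purely for illustration."""
    return hashlib.sha256(data).hexdigest()

def count_matches(images: list[bytes], known_hashes: set[str]) -> int:
    """Count how many of a user's images match the known-CSAM hash list."""
    return sum(1 for img in images if image_hash(img) in known_hashes)

def should_flag(images: list[bytes], known_hashes: set[str]) -> bool:
    """Flag an account only once matches reach the threshold, which is how
    the system mitigates false positives from stray hash collisions."""
    return count_matches(images, known_hashes) >= MATCH_THRESHOLD
```

The design point the threshold captures is that any single hash match could be a collision; requiring dozens of independent matches makes a false flag vanishingly unlikely before any human review occurs.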
Documents revealed during the Epic Games v. Apple trial indicated that Apple anti-fraud chief Eric Friedman believed the Cupertino tech giant’s services were the “greatest platform for distributing” CSAM. Friedman attributed that assessment to Apple’s strong stance on user privacy.
Despite the backlash, Apple is pressing forward with its plans to debut the CSAM detection system. It maintains that the platform will still preserve the privacy of users who do not have collections of CSAM on their iCloud accounts.