Last week, Apple announced a new system to detect CSAM (Child Sexual Abuse Material) in iCloud, a measure many users see as the end of the privacy Apple has accustomed us to. If you have questions about how it works, you can visit this link, where we explain what the system is and how it really works.
While the controversy over this system continues, and some rights associations are urging Apple not to implement it, a San Francisco Bay Area doctor has been charged with possessing child pornography in his Apple iCloud account.
The U.S. Department of Justice announced Thursday that 58-year-old Andrew Mollick had at least 2,000 images and videos depicting the sexual exploitation of children stored in his iCloud account. Mollick is an oncology specialist affiliated with several San Francisco Bay Area medical centers, as well as an associate professor at the UCSF School of Medicine.
The accused had uploaded one of those images to the social media application Kik, which prompted federal investigators to track him down. It was during that investigation that they found all the content he had stored in iCloud.
Apple presses on despite the controversy
Apple recently announced plans to introduce a system designed to detect child sexual abuse material in iCloud and report it to the National Center for Missing and Exploited Children (NCMEC).
The system does not scan the images themselves in a user's iCloud account. Instead, it checks the hashes (identifiers) of images stored in iCloud against known CSAM hashes provided by child safety organizations.
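To illustrate the general idea of hash matching, here is a minimal Swift sketch. It uses an ordinary SHA-256 hash from CryptoKit as a stand-in; Apple's actual system relies on NeuralHash, a perceptual hash that survives resizing and re-encoding, combined with private set intersection so the device never sees the known-hash list directly. The hash values below are placeholders, not real identifiers.

```swift
import Foundation
import CryptoKit

// Placeholder set standing in for known CSAM hashes supplied by
// child safety organizations. These are illustrative values only.
let knownHashes: Set<String> = [
    "placeholder-hash-1",
    "placeholder-hash-2"
]

// Simplified stand-in: hash the raw image bytes with SHA-256.
// (Apple's system uses NeuralHash, a perceptual hash, not SHA-256.)
func hashForImage(_ imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Check whether an image's hash appears in the known-hash set.
func matchesKnownHash(_ imageData: Data) -> Bool {
    knownHashes.contains(hashForImage(imageData))
}
```

The key point this sketch captures is that only identifiers are compared, not image content; the real design adds cryptographic safeguards (safety vouchers, a match threshold, and human review) before anything is reported.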
Despite the backlash, Apple is pushing ahead with its plans to debut the CSAM detection system. The company maintains that the platform will continue to preserve the privacy of users who do not have CSAM collections in their iCloud accounts.