Apple will scan your iCloud Photos uploads for child abuse images


Apple will implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage service, to ensure the uploads do not match known images of child sexual abuse.

The system, as per a Financial Times report (paywalled), is called neuralMatch. When it flags images or content related to Child Sexual Abuse Material (CSAM), a team of human reviewers will alert law enforcement authorities. The system was reportedly trained on 200,000 images from the National Center for Missing and Exploited Children, and it will scan, hash, and compare Apple users' photos against a database of known images of child sexual abuse.
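The hash-and-compare step can be pictured with a minimal sketch. This is not Apple's implementation: the real system reportedly uses a perceptual hash rather than a cryptographic one, and the database contents, names, and functions below are assumptions for illustration only.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: Apple's actual system reportedly uses a
// perceptual "NeuralHash", not a plain SHA-256 lookup. Here a SHA-256
// digest stands in for a per-image fingerprint, and `knownHashes` is a
// hypothetical, locally stored set of fingerprints derived from the
// reference database.
let knownHashes: Set<String> = [
    "3b6a27bc…"   // placeholder entry, not a real fingerprint
]

// Compute a hex fingerprint for the raw image bytes.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Check an upload against the known-image database before it leaves the device.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}
```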

“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities,” said the Financial Times report.
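The threshold logic described in the report might look roughly like the sketch below. The types and names are made up, and the threshold value is an assumption; the report does not say how many suspect photos trigger a review.

```swift
// Hedged sketch of the "safety voucher" threshold described above.
// `SafetyVoucher`, `reviewThreshold`, and the function name are
// illustrative assumptions, not Apple's API.
struct SafetyVoucher {
    let photoID: String
    let isSuspect: Bool
}

let reviewThreshold = 30   // hypothetical value for illustration only

// Escalate an account for human review only once enough uploads
// have been marked as suspect.
func shouldEscalateForHumanReview(_ vouchers: [SafetyVoucher]) -> Bool {
    vouchers.filter(\.isSuspect).count >= reviewThreshold
}
```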

Following the report, Apple published an official post on its Newsroom to explain how the new tools work. Developed in collaboration with child safety experts, the tools will detect CSAM images stored in iCloud Photos and use on-device machine learning to warn children, as well as their parents, about sensitive and sexually explicit content in iMessage.
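The Messages warning can be imagined as a simple gate around an on-device classifier. Apple has not published this API; the classifier, names, and threshold below are assumptions sketched for illustration.

```swift
import Foundation

// Hypothetical stand-in for an on-device ML model (e.g. a Core ML classifier)
// that scores an image for sexually explicit content.
struct SensitiveContentClassifier {
    func explicitnessScore(for imageData: Data) -> Double { 0.0 }
}

// If the score crosses the threshold, the app would blur the image and
// warn the child (and, optionally, a parent) before it is viewed.
func shouldBlurAndWarn(_ imageData: Data,
                       using classifier: SensitiveContentClassifier,
                       threshold: Double = 0.9) -> Bool {
    classifier.explicitnessScore(for: imageData) > threshold
}
```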