Apple intends to begin installing software on iPhones in the United States that will automatically scan locally stored photos for images of child abuse. According to reports, Apple will initially use this technology only on images uploaded to iCloud.
The Financial Times and Johns Hopkins University professor Matthew Green broke the news of Apple’s plans. Note that Apple has not yet confirmed any of this, so the details remain unverified for now. Apple apparently showed the proposal to a group of US academics earlier this week.
According to these reports, Apple will scan American iPhones for child abuse imagery using a system called “neuralMatch.”
According to individuals present at the meeting, every photo uploaded to iCloud is assigned a “safety voucher” indicating whether or not the photo is suspect. Once a certain number of photos have been flagged as suspicious, Apple will decrypt them and, if evidence of child abuse is found, report them to law enforcement.
In essence, if the automated system detects illicit images, it notifies a team of human reviewers. A team member would then evaluate the images and, if warranted, contact law enforcement.
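To make the reported pipeline concrete, here is a minimal Swift sketch of how such a voucher-and-threshold scheme could work. This is an illustration under assumptions, not Apple’s implementation: the names (SafetyVoucher, UploadScanner), the use of SHA-256 in place of the reported perceptual “neuralMatch” hash, and the escalation stub are all hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the reported "neuralMatch" hash. The real system
// is said to use a perceptual hash so that resized or re-encoded copies of a
// known image still match; SHA-256 is used here only to keep the sketch
// self-contained and runnable.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// A "safety voucher" attached to each photo queued for iCloud upload,
// recording whether the photo matched the known-image database.
struct SafetyVoucher {
    let photoID: UUID
    let isSuspect: Bool
}

final class UploadScanner {
    private let knownHashes: Set<String>  // hashes of known abuse imagery (assumed format)
    private let threshold: Int            // suspect count required before escalation
    private var vouchers: [SafetyVoucher] = []

    init(knownHashes: Set<String>, threshold: Int) {
        self.knownHashes = knownHashes
        self.threshold = threshold
    }

    // Called on-device for each photo as it is queued for iCloud upload.
    func scan(photoID: UUID, imageData: Data) {
        let suspect = knownHashes.contains(imageHash(imageData))
        vouchers.append(SafetyVoucher(photoID: photoID, isSuspect: suspect))

        // Nothing is escalated for a single match; only once the number of
        // suspect vouchers crosses the threshold are the flagged photos
        // handed off for human review.
        let flagged = vouchers.filter { $0.isSuspect }
        if flagged.count >= threshold {
            escalateToHumanReview(flagged)
            vouchers.removeAll { $0.isSuspect }
        }
    }

    private func escalateToHumanReview(_ vouchers: [SafetyVoucher]) {
        // Placeholder: per the reports, flagged photos would be decrypted and
        // reviewed by a human team, which could then contact law enforcement.
        print("Escalating \(vouchers.count) flagged photos for human review")
    }
}
```

In the reported design, the hash database, the voucher check, and the threshold would all live on the device, so only photos that cross the threshold would ever be seen by Apple’s reviewers.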
This type of scanning is already performed by cloud-based photo storage services and social networks; the difference here is that Apple would do it on the device itself. According to Matthew Green, the system will at first scan only photos being uploaded to iCloud, but it will do so on the user’s phone, and the software could eventually be used to scan all photos stored locally.