Apple has announced that it will delay its plan to roll out Child Sexual Abuse Material (CSAM) detection for the iPhone in the US after much criticism, saying it needs more time to make "improvements before these critical child safety features are launched."
Apple announced the CSAM scanning feature last month as part of the iOS 15 update. It is designed to identify child sexual abuse images stored on iPhones, and Apple plans to extend it to the iPad, Apple Watch, and Mac as well. When a device detects images related to child pornography or child abuse, it will automatically blur the content and report it to Apple's servers. The feature will initially be available in the United States; once abuse content is detected on a US iPhone, Apple will automatically alert the National Center for Missing and Exploited Children (NCMEC) and law enforcement agencies using the user's Apple ID.
Cybersecurity and privacy advocates are concerned about the new feature: if Apple can detect child abuse imagery on iPhones with such accuracy, what is to stop it from scanning for content related to political activity or opposition? Governments, they argue, could force Apple in the future to surveil political opponents, demonstrators, and whistleblowers.
Apple will use on-device matching against a database of known hashes of child abuse images provided by NCMEC and other child safety organizations: before an image is stored in iCloud Photos, the device matches it against the known CSAM hashes.
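The matching step can be sketched in a few lines. This is a minimal illustration, not Apple's implementation: Apple uses NeuralHash, a perceptual hash that tolerates resizing and recompression, whereas the SHA-256 stand-in below only matches byte-identical files, and the database contents and function names here are hypothetical.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash such as NeuralHash; SHA-256 only matches
    # byte-identical files, so this is purely illustrative.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of known hashes (in Apple's design, supplied by
# NCMEC and other child safety organizations).
known_hashes = {image_hash(b"example-known-image")}

def check_before_upload(photo_bytes: bytes) -> bool:
    """Return True if the photo matches the known-hash database."""
    return image_hash(photo_bytes) in known_hashes

print(check_before_upload(b"example-known-image"))  # True
print(check_before_upload(b"unrelated-photo"))      # False
```

In the real system this comparison happens on the device itself, before the photo is uploaded to iCloud Photos.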
It uses a cryptographic technique called private set intersection, which determines whether there is a match without revealing the result. The device generates an encrypted safety voucher that encodes the match result along with additional encrypted data about the photo, and this voucher is uploaded to iCloud Photos with the photo.
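The core idea behind private set intersection is that both parties blind their values with secret exponents, and because exponentiation commutes, a match survives the double blinding while the unblinded values are never exchanged. Below is a toy Diffie-Hellman-style sketch of that idea only; it is not Apple's protocol (which additionally hides the match result from the device and uses threshold secret sharing), and the tiny prime makes it insecure for any real use.

```python
import hashlib
import secrets

P = 2**127 - 1  # small Mersenne prime; toy parameter, far too small for real crypto

def h(item: bytes) -> int:
    """Hash an item into the multiplicative group mod P."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

def blind(value: int, secret: int) -> int:
    return pow(value, secret, P)

# Server (the photo service) holds known-image hashes; client holds one photo hash.
server_set = {h(b"known-image-1"), h(b"known-image-2")}
client_item = h(b"known-image-2")

a = secrets.randbelow(P - 2) + 1  # client's secret exponent
b = secrets.randbelow(P - 2) + 1  # server's secret exponent

# Client blinds its item and sends it; server blinds it a second time.
double_blinded_client = blind(blind(client_item, a), b)

# Server blinds its whole set once and sends it; client blinds each entry again.
double_blinded_server = {blind(blind(s, b), a) for s in server_set}

# x^(a*b) == s^(b*a) mod P exactly when x == s, so matches are detectable
# even though neither side ever sees the other's raw hashes.
print(double_blinded_client in double_blinded_server)  # True
```

The safety voucher layer sits on top of a scheme like this: the match result and the photo metadata are encrypted into the voucher, which the server can only open under the conditions the protocol defines.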