When this technology, called “NeuralMatch”, is launched, photos will be scanned before they are saved to the iCloud storage platform.
If this automated examination flags a strong match between the images about to be stored and the child-abuse images in the database, Apple staff then review the case.
If the material is confirmed to be child pornography, Apple suspends the user’s account and then notifies the US National Center for Missing and Exploited Children.
Apple, however, will compare images only against the database of the US National Center for Missing and Exploited Children.
Experts say the technology is accurate because it relies on a database held by an official institution. Parents who take pictures of their young children in the bath therefore need not worry, because those pictures will not be treated as pornographic content.
This automated system does not look at the images directly; instead, it relies on mathematical fingerprints (hashes) representing the content of each image, and searches for matches between those fingerprints.
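To make the idea of fingerprint matching concrete, here is a minimal, purely illustrative sketch in Python. It uses a toy “average hash” (one bit per pixel, set when the pixel is brighter than the image’s mean) and matches two fingerprints when they differ in only a few bits. This is not Apple’s actual NeuralMatch algorithm; the hash, threshold, and sample data are all invented for illustration.

```python
def average_hash(pixels):
    """Toy perceptual fingerprint: for a grid of grayscale values (0-255),
    emit one bit per pixel, set when the pixel exceeds the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Number of bit positions in which two fingerprints differ."""
    return sum(a != b for a, b in zip(h1, h2))

def is_match(h1, h2, threshold=2):
    """Two images 'match' if their fingerprints differ in at most
    `threshold` bits; the threshold here is purely illustrative."""
    return hamming_distance(h1, h2) <= threshold

# A known image and a slightly altered copy (one pixel changed):
known = [[10, 200], [200, 10]]
altered = [[10, 200], [200, 30]]
print(is_match(average_hash(known), average_hash(altered)))  # True

# A visually different image does not match:
other = [[200, 10], [10, 200]]
print(is_match(average_hash(known), average_hash(other)))  # False
```

The point of hashing rather than direct inspection is that the system never needs to “see” the photo: only compact fingerprints are compared, and a small tolerance absorbs minor edits such as resizing or recompression.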
The problem, according to activists, is that this child-protection system could be turned against users.
Matthew Green, a cryptography researcher at Johns Hopkins University, explains that the method could be used to harm innocent people: an attacker could send seemingly innocuous images that have been crafted so that, when examined by Apple’s system, they register as matches for pornographic images.
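The collision risk Green describes can be demonstrated with a toy fingerprint: two visibly different images can produce the exact same hash, so a matcher would flag them as identical. The “average hash” below is an invented illustration, not Apple’s actual algorithm, and the pixel grids are made-up sample data.

```python
def average_hash(pixels):
    """Toy perceptual fingerprint: one bit per pixel, set when the
    pixel is brighter than the image's mean (illustrative only)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

# Two clearly different images...
image_a = [[10, 200], [200, 10]]
image_b = [[90, 140], [140, 90]]

# ...collide to the same fingerprint, so a fingerprint-based matcher
# cannot tell them apart.
print(average_hash(image_a), average_hash(image_b))  # 0110 0110
```

Because any fingerprint compresses an image into far fewer bits than the image contains, collisions are unavoidable in principle; the attack Green warns about consists of deliberately engineering such collisions against the specific hash a system uses.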
In addition to the NeuralMatch technology, Apple plans to examine users’ encrypted messages on the iMessage platform in order to automatically detect pornographic images, allowing parents to enable a feature that automatically deletes such harmful content from their children’s phones.