Apple will check messages and content stored in its iCloud service for images of child sexual abuse, the company said. The feature will be introduced in new software versions for all of its devices.
“The Messages app will use on-device machine learning to warn about sensitive content while keeping private communications unreadable by Apple. iOS and iPadOS will use new applications of cryptography to help limit the spread of child sexual abuse material online,” the company said.
Apple noted that the technology was designed with user privacy in mind, so photos stored in iCloud will not themselves be scanned. Instead, the system matches a hash, or digital fingerprint, of each image against a database of known child sexual abuse images, the company explained.
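To illustrate the general idea of hash matching, here is a minimal sketch in Swift. It uses a plain SHA-256 digest purely for illustration; Apple's actual system reportedly relies on a perceptual hash (“NeuralHash”) that survives re-encoding and resizing, and the knownDigests set below is a hypothetical stand-in for a real database.

```swift
import Foundation
import CryptoKit

// Hypothetical database of fingerprints of known abuse images.
// In a real deployment this would be supplied by child-safety organizations.
let knownDigests: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
]

// Compute a hex-encoded SHA-256 digest of a file's bytes.
// (Illustrative only: a cryptographic hash changes completely if the image
// is re-encoded, which is why real systems use perceptual hashes instead.)
func fingerprint(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Flag a file if its fingerprint appears in the known-hash database.
func matchesKnownImage(_ fileURL: URL) -> Bool {
    guard let digest = try? fingerprint(of: fileURL) else { return false }
    return knownDigests.contains(digest)
}
```

The key design point is that only fingerprints are compared, so a match can be detected without any party viewing the photo itself.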
If a sexually explicit photo is received on a child's device, it will appear blurred and the user will see a warning. If the child tries to send such a photo, they will also be warned, and if the photo is sent anyway, the parents will receive a notification.
In addition, special tools for identifying child sexual abuse material in iCloud “will help Apple provide law enforcement with valuable information.”
Shortly before Apple's announcement, the Financial Times reported, citing sources, that the company plans to install software called neuralMatch on the iPhones of US users, which would scan images for child abuse material.
A number of experts have criticized the new system. In particular, Matthew Green, a security expert and associate professor at the Johns Hopkins Information Security Institute, called it “a really bad idea.” “This kind of tool can be a boon for finding child pornography on phones. But imagine what it could do in the hands of an authoritarian government?” he said.
For his part, Ross Anderson, a professor of security engineering at the University of Cambridge, warned in a conversation with the FT that Apple's initiative could lead to “decentralized mass surveillance of phones and laptops.”