Apple has released details of its plan to install software on Americans' iPhones designed to detect images of child sexual abuse.
When a photo or video is uploaded to iCloud, the program will automatically compare it against a database of already known child sexual abuse material (abbreviated CSAM).
If the system raises an alert, it will be checked by a human reviewer, who will, if necessary, notify law enforcement.
There are concerns that the technology could be used to track any prohibited content, including political content, especially in countries with authoritarian regimes.
Apple says the latest versions of iOS and iPadOS, due before the end of this year, will include new encryption tools to “help limit the spread of CSAM while protecting users' privacy.”
The images will be compared against a database compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child-protection organizations.
According to Apple, the program will also be able to detect edited versions of known images.
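The matching described above relies on perceptual hashing: images are reduced to short fingerprints that stay similar when the picture is lightly edited. Apple's actual system uses a proprietary hash ("NeuralHash") combined with cryptographic matching; the toy "average hash" below is only a sketch of the general idea, showing why a brightened copy of a known image can still match a database entry.

```python
# Toy illustration of perceptual-hash matching. This is NOT Apple's
# NeuralHash; it is a minimal "average hash" over an 8x8 grayscale image,
# here represented as a flat list of 64 pixel values (0-255).

def average_hash(pixels):
    """Each bit records whether a pixel is brighter than the image mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    """Number of bits on which two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def matches(h1, h2, threshold=5):
    """Treat images as the same if their hashes differ in only a few bits."""
    return hamming(h1, h2) <= threshold

# A "known" image and a brightened (edited) copy of it.
original = [(i * 37) % 256 for i in range(64)]
brightened = [min(p + 20, 255) for p in original]

db_hash = average_hash(original)        # stored in the database
upload_hash = average_hash(brightened)  # computed for the uploaded copy

print(matches(db_hash, upload_hash))    # the edited copy still matches
```

Because the hash depends on brightness relative to the image's own mean, uniform edits such as brightening barely change it, while a genuinely different image would differ in many bits and fail the threshold.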
“High accuracy”
The corporation insists that its system is extremely accurate, putting the chance of incorrectly flagging an account at one in a trillion per year.
In addition, every alert raised by the system will be reviewed manually, and only a human will be able to block a user's account and pass their data to the police.
The company says the new features offer even stronger user privacy than before, since no data about photos on an iPhone is collected unless they match the CSAM database.
However, some experts have expressed doubts.
“Regardless of what Apple actually plans, they have sent an unambiguous signal: in their (highly influential) opinion, it is acceptable to build systems that scan users' phones for prohibited content,” says Matthew Green, a cryptography researcher at Johns Hopkins University. “The dam will break. Governments will demand it of everyone.”