The child pornography detection tools Apple plans to build into its phones and tablets are designed to prevent the company from accessing other user-generated content. This was stated by one of the company's leaders, Craig Federighi, Apple's senior vice president of software engineering.
In an interview with The Wall Street Journal, Federighi addressed user concerns that the new features would weaken privacy and could be exploited by “repressive governments” to access data.
“If and only if you exceed a certain threshold, which is something like 30 matches of known photographic images of child pornography, only then will Apple know something about your account and those pictures, but not about any of your other pictures,” the Apple executive said.
Apple plans to check images uploaded to its iCloud cloud storage against a database of known child pornography and, in the event of multiple matches, report the account to the U.S. National Center for Missing and Exploited Children.
As Federighi clarified, the system will not analyze whether a user keeps photos of their own child in the bath, or whether they have pornographic pictures of any kind. “It's literally about finding exact matches with specific known child pornographic images,” he explained.
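The thresholding described above can be sketched in a few lines. This is purely an illustration, not Apple's implementation: the real system uses a perceptual hash (NeuralHash) and cryptographic private set intersection, whereas the stand-in `fingerprint` below is an ordinary SHA-256 digest; the function names and the threshold constant are illustrative assumptions, with the value 30 taken from Federighi's quote.

```python
import hashlib

# Roughly the match count Federighi cited as the trigger.
MATCH_THRESHOLD = 30

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint for an image. Apple's real system uses a
    perceptual hash (NeuralHash); a SHA-256 digest is used here only
    to illustrate exact matching against a known-image database."""
    return hashlib.sha256(image_bytes).hexdigest()

def flagged_for_review(uploads: list[bytes], known_db: set[str],
                       threshold: int = MATCH_THRESHOLD) -> bool:
    """Count uploads whose fingerprint appears in the known-image
    database; only at or above the threshold is the account flagged.
    Below the threshold, nothing about the account is revealed."""
    matches = sum(1 for img in uploads if fingerprint(img) in known_db)
    return matches >= threshold
```

The key property the sketch captures is that non-matching images contribute nothing: the decision depends only on the count of exact matches against the database, never on the content of other photos.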