Apple has revealed the details of a new and already controversial feature under which the company will scan users' photos stored in its iCloud cloud service.
According to Apple, personal photo libraries will be checked only to detect child sexual abuse material (CSAM). The files will be hashed, and the resulting "digital fingerprints" will then be compared against a database maintained by the National Center for Missing & Exploited Children (NCMEC).
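To illustrate the general idea of hash-based matching, here is a minimal Python sketch. It is not Apple's actual system, which uses a perceptual NeuralHash and a private set intersection protocol rather than plain cryptographic hashes; the database contents here are purely hypothetical placeholders.

```python
import hashlib

# Hypothetical set of known "digital fingerprints" of flagged images.
# In the real system these would come from NCMEC, not be hard-coded.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}


def fingerprint(path: str) -> str:
    """Compute a cryptographic fingerprint (SHA-256) of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def is_flagged(path: str) -> bool:
    """Return True if the file's fingerprint matches a known entry."""
    return fingerprint(path) in KNOWN_HASHES
```

Unlike this sketch, a perceptual hash is designed so that visually similar images produce matching fingerprints even after resizing or re-encoding, which is why Apple pairs it with additional safeguards against false matches.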
At the same time, the manufacturer promises not to copy or transfer user photos to third-party servers. Apple says its system can detect illegal content with fewer than one falsely flagged account per trillion per year. The company also said that if the system is triggered, the suspicious content will first undergo additional human review, and only after that will the information be passed to law enforcement agencies.
Source: Apple