It's a pretty gross overstep, since Apple champions itself as being consumer-privacy conscious. However, they are only scanning photos being uploaded to iCloud. The EARN IT Act and similar legislation in other countries aim to make content hosts responsible for child sexual abuse material (CSAM) on their servers, so companies that store data are pushed to do this or something similar. AWS, Azure, and Google all run similar scans for CSAM. The main issues are that the hashing method being used is easy to tamper with in order to cause false positives, and that governments could now request that other material be scanned for on iCloud, since the tool exists and the library of target material is editable.

https://en.wikipedia.org/wiki/EARN_IT_Act_of_2020

Here's a link with more information on the image hashing used and the flaws of the system:

https://towardsdatascience.com/black-box-attacks-on-perceptual-image-hashes-with-gans-cc1be11f277
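To make the false-positive problem concrete, here is a minimal sketch of a classic perceptual "average hash" in Python (assuming Pillow is installed; the file names are hypothetical). Apple's NeuralHash is a learned, far more complex function, but the attack surface in that article is the same general idea: visually similar images map to identical or near-identical hashes, so an attacker who can craft an innocent-looking image that lands on a target hash can trigger a match.

```python
from PIL import Image


def average_hash(path: str, hash_size: int = 8) -> int:
    """Shrink, grayscale, and threshold at the mean to get a 64-bit perceptual hash."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance is treated as a 'match'."""
    return bin(a ^ b).count("1")


# Hypothetical usage: two images that look nothing alike to a human can still
# collide if one has been adversarially perturbed toward the other's hash.
# h1 = average_hash("original.jpg")
# h2 = average_hash("adversarial.jpg")
# print(hamming_distance(h1, h2))  # 0 or near 0 => flagged as the same image
```

The linked article describes doing this against perceptual hashes in a black-box setting with GANs, which is why a hash-matching pipeline needs human review and secondary checks before anything is reported.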
