It's a pretty gross overstep, since Apple champions itself as being consumer-privacy conscious. However, they are only scanning photos being uploaded to iCloud. The EARN IT Act and similar legislation in other countries make content hosts responsible for child sexual abuse material (CSAM) on their servers, so companies that store data are forced to do this or something similar. AWS, Azure, and Google all run comparable scans for CSAM. The main issues are that the hashing method is easy to frick with in order to produce false positives, and that governments could now request other material be scanned for on iCloud, since the tool exists and the library of material is editable.

https://en.wikipedia.org/wiki/EARN_IT_Act_of_2020
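To make the "scanning" part concrete: services that do this generally compute a perceptual hash of each upload and compare it against a database of hashes of known material, flagging anything within some bit-distance threshold. Here's a rough Python sketch of just that matching step, not Apple's actual NeuralHash pipeline; the hash values and the threshold below are made up for illustration.

```python
# Minimal sketch of perceptual-hash matching against a known-image database.
# Not Apple's NeuralHash pipeline; hashes and threshold are hypothetical.

KNOWN_HASHES = {0xd1d1b6b6c8c89191, 0x0f0f3c3cf0f03c3c}  # hypothetical blocklist
MATCH_THRESHOLD = 6  # max differing bits still treated as "the same image"

def hamming_distance(a: int, b: int) -> int:
    """Number of bit positions where two 64-bit hashes differ."""
    return bin(a ^ b).count("1")

def is_flagged(upload_hash: int) -> bool:
    """Flag an upload if its hash is 'close enough' to any known hash."""
    return any(hamming_distance(upload_hash, h) <= MATCH_THRESHOLD
               for h in KNOWN_HASHES)

print(is_flagged(0xd1d1b6b6c8c89193))  # True: only 1 bit away from a known hash
print(is_flagged(0x1234567890abcdef))  # False: nothing nearby
```

The fuzziness of that threshold is the whole point and the whole problem: it has to tolerate resizing and recompression, which is the same slack an attacker can abuse.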

Here’s a link for information on the image hashing used and the flaws of the system:

https://towardsdatascience.com/black-box-attacks-on-perceptual-image-hashes-with-gans-cc1be11f277
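If you want to play with the idea yourself, here's a toy "average hash" in Python using Pillow. It is not NeuralHash, but it shows the core property the article is about: the image gets reduced to a tiny fingerprint that barely changes under benign edits, and that tolerance is what lets an attacker nudge an unrelated image toward a target hash. The file name is just a placeholder.

```python
# Toy "average hash" (aHash), not Apple's NeuralHash. Illustrates why a
# perceptual hash barely changes under benign edits, which is the slack an
# adversary can exploit to manufacture collisions / false positives.
from PIL import Image

def average_hash(img: Image.Image) -> int:
    """Shrink to 8x8 grayscale and set one bit per pixel above the mean."""
    small = img.convert("L").resize((8, 8))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# "photo.jpg" is a placeholder; any local image will do.
original = Image.open("photo.jpg")
downscaled = original.resize((original.width // 2, original.height // 2))

h1, h2 = average_hash(original), average_hash(downscaled)
print(f"{h1:016x} vs {h2:016x} -> {hamming_distance(h1, h2)} bits differ")
# Usually only a handful of bits differ even though the files are different.
```

The linked article goes one step further and uses a GAN to search for images that land near a target hash on purpose, which is how you get deliberate false positives.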

Worse, a ph*ne poster

broken hyperlink

Oh, poor sad angry little poodle, yapping away its aggression on the internet. How bloody typical. If I were a better person I'd pity you. It is, however, not my fault that you are uneducated and have to fall back on silly childish insults. As it is, you have provided amusement, my little ankle-biter. Looking at the nonsense, it seems you are in one heck of a lot more misery than I will ever be, my angry little brat. You should thank me for letting you get your anger out in a safe place. I thank you for the giggle.

