Update 20210910
Two days ago, the UK Home Secretary Priti Patel said this regarding Apple:
"End-to-end encrypted messaging presents a big challenge to public safety, and this is not just a matter for governments and law enforcement. Social media companies need to understand they share responsibility for keeping people safe. They cannot be passive or indifferent about what their products enable or how they might inadvertently blind themselves and law enforcement from protecting children with end-to-end encryption." It goes on this way for a bit, and then "The UK is a global leader in tackling online child sexual abuse. We are bringing in new safety laws via the Online Safety Bill." They then mention Apple explicitly. "Recently Apple have taken the first step, announcing that they are seeking new ways to prevent horrific abuse on their service. Apple state their child sexual abuse filtering technology has a false positive rate of 1 in a trillion, meaning the privacy of legitimate users is protected whilst those building huge collections of extreme child sexual abuse material are caught out. They need to see that project through."
Update 20210903
Apple released an update to their policy stating that the plan has been delayed, but the wording of the update still makes it clear that Apple fully intends to do this at some point.
Update 20210822
The content of this article still represents my views, but it should be noted that a flurry of reforms at tech and finance companies appears to stem from part of the United Kingdom's Data Protection Act of 2018: specifically, the Age Appropriate Design Code, which concerns people under 18 and will come into force on 2 September 2021. Similar legal reforms are being considered in many other jurisdictions. Because these rules also affect payment processors, they put pressure on companies that might otherwise have ignored the UK. Apple loves to appease governments, so they are probably following the AADC... but they also issue a MasterCard credit card.
Original article content follows
Earlier this month, Apple announced that iOS 15 and iPadOS 15 will introduce a CSAM (Child Sexual Abuse Material) scanning system on iPhones and iPads for any image sent to iCloud. Unlike everyone else, Apple is not scanning material server-side after it has been uploaded to iCloud; it is instead scanning the photos on the iTrinket itself, before upload. This system is explained somewhat in a PDF file that Apple made publicly available. They are allegedly doing this to save the children or whatever. Nice goal. I agree that sexual abuse of children is evil.
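To be clear about that distinction, here is a rough sketch of my own. The function names and logic are invented placeholders, not Apple's actual code or API; the only point is where the scanning happens.

```python
# Illustrative contrast only; these are my own stand-ins, not Apple's implementation.

def matches_known_csam(image_bytes: bytes) -> bool:
    """Placeholder for a perceptual-hash lookup against a known-CSAM hash
    database (Apple's real system is built around NeuralHash)."""
    return False  # stub for illustration

def server_side_scan(uploaded_bytes: bytes) -> bool:
    """What other cloud providers do: the provider inspects content only
    after the user has handed it over to their servers."""
    return matches_known_csam(uploaded_bytes)

def on_device_scan(photo_library: list) -> list:
    """What Apple describes: the user's own device walks the local library
    and produces match information before anything is uploaded."""
    return [photo for photo in photo_library if matches_known_csam(photo)]
```

The difference matters: in the first model you opt in by uploading; in the second, your own hardware is doing the looking.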
In the introduction on page 3, Apple claims the following:
Apple does not learn anything about images that do not match the known CSAM material database
Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account
The risk of the system incorrectly flagging an account is extremely low. In addition, Apple manually reviews all reports made to NCMEC to ensure reporting accuracy
Users can’t access or view the database of known CSAM images
Users can’t identify which images were flagged as CSAM by the system
Apple claims that their system won't generate false positives often, which has been demonstrated to be incorrect. It also claims that once a threshold of detected CSAM material is met, human review takes place. This implies both that Apple knows false positives will happen and that Apple can break the encryption of iCloud whenever it chooses. Apple goes to great lengths to explain how an image comes to be revealed from iCloud storage, but Apple built all of that complexity itself. While this complexity was ostensibly introduced to reduce the creepiness factor of what Apple is doing and thereby maintain some sense of privacy for Apple's users, it exists only because Apple chose to be nice. Apple could just as easily not be nice and snag all of your shit.
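To make the threshold mechanic concrete, here is a minimal toy sketch of a match-count gate. This is my own code, not Apple's: the hash function, blocklist, and threshold value are all placeholders (the real system uses NeuralHash, blinded hash sets, and threshold secret sharing).

```python
# Toy sketch of a threshold-gated match count (NOT Apple's implementation).
# perceptual_hash, BLOCKLIST, and THRESHOLD are invented placeholders.

THRESHOLD = 30  # hypothetical number; stands in for Apple's unpublished value

BLOCKLIST = {0xDEADBEEF, 0x12345678}  # stand-in for the NCMEC-derived hash set

def perceptual_hash(image_bytes: bytes) -> int:
    """Placeholder for a perceptual hash such as NeuralHash."""
    return hash(image_bytes) & 0xFFFFFFFF  # toy stand-in only

def account_flagged(photo_library) -> bool:
    """Count matches against the blocklist; only once the count crosses the
    threshold does the system say a human reviewer gets involved."""
    matches = sum(1 for img in photo_library if perceptual_hash(img) in BLOCKLIST)
    return matches >= THRESHOLD
```

That is my point above in miniature: the restraint lives entirely in the parameters the operator picks, the threshold and the blocklist, not in anything the user controls.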
Also, users apparently cannot appeal, since a user cannot identify which images were flagged. Apple, in this case, is saying "you're fucked" while not allowing you to see what triggered their detection. Based upon the NeuralHash collisions that I have seen, this system could be triggered by images that are absolutely nothing like an image in the NCMEC database.
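As a toy illustration of why such collisions are plausible, here is a simple average hash over made-up pixel grids; this has nothing to do with the real NeuralHash, but it shows the basic failure mode: a perceptual hash only captures a coarse summary, so clearly different inputs can hash identically.

```python
# Toy demonstration of a perceptual-hash collision (NOT NeuralHash).
# The grids and the hashing scheme are invented purely for illustration.

def average_hash(pixels):
    """Hash a grid of grayscale values: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

image_a = [[200, 200, 10, 10],
           [200, 200, 10, 10],
           [10, 10, 200, 200],
           [10, 10, 200, 200]]

# Very different pixel values, same above/below-the-mean pattern -> same hash.
image_b = [[130, 140, 60, 50],
           [135, 145, 55, 65],
           [50, 60, 130, 140],
           [55, 65, 135, 145]]

assert average_hash(image_a) == average_hash(image_b)
print(f"collision on hash {average_hash(image_a):016b}")
```

The real NeuralHash is far more elaborate than this, but the collisions I refer to above demonstrate that the same basic failure mode applies to it.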
Apple has broken their word.
Apple claims that privacy is a human right. Apple claims that security matters to them. Apple claims so many wonderful things. In this case, Apple has asserted that the supposed owners of iPhones do not own their iPhones. Apple owns their iPhones. Apple will, when it deems it necessary, make your iPhone work against you in an attempt to incriminate you. In the past, Apple has complied with government requests from rather brutal regimes, and we are supposed to trust that Apple will only look for kiddie porn? Apple won't check to see if you're a conservative who likes Trump and then turn you in to DHS as a suspected terrorist? Apple won't decide that you maybe said something like "freedom" and then turn you over to the Chinese Communist Party? Apple already censors quite the array of words that can be engraved on devices in China. Apple capitulated to storing Chinese users' iCloud data on state-run servers. Apple has decrypted devices for the FBI. I have absolutely no reason to believe that this move by Apple is not an avenue for future abuse, and I have every reason to believe that this technology will be used to further tighten the yoke around people's necks the world over.
If this weren't enough, Apple is also going to be breaking the end-to-end encryption of iMessage "to save the kids." Effectively, Apple will be monitoring kids' chats to make sure they are not being groomed and aren't sharing dick pics and the like. Once again, I have no reason to believe Apple when they say that this type of privacy intrusion will be confined to protecting the children.
At this point, there is no tech giant that ensures user privacy, user security, or user ownership of his/her device. Apple was the last one doing anything for their customers, and now the iTrinkets, too, are just surveillance devices for the governments of the Earth.
An array of organizations from across the globe have signed a letter to Tim Cook voicing their concerns over this change in policy at Apple. Apple has apparently ignored it, said "Nah, really, we're the good guys. Trust us!", and decided to go forward anyway. I suppose that they consider all of these concerned groups and individuals the “screeching minority” who can simply be ignored.
On the privacy page of Apple's site, at the time of this writing, Apple still proclaims that your data is yours.
They even claim that you don't need to share your images with anyone.
I suppose that these bold claims are there only to placate the screeching masses, right? Apparently, there was an addendum that I missed, and it read
LOL, j/k. We will just snoop whenever the fuck we feel like it
Okay, Apple, you suck.