Apple plans to scan your images for child porn
-
Let's break down a bit from Apple's FAQ:
"Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM? Our process is designed to prevent that from happening."
This is an absolute "yes". In no way do they say "no", because they can't. Just as "using a password" is designed to prevent the wrong people from accessing a system, we know that "the process is designed to prevent..." is anything but "the process will only ever do...".
"CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations."
So we know that private companies in the US can be forced by the government to do just about anything, we know that non-profits tend heavily toward corruption, and we know that non-profits can trivially be hacked. So claiming that the list of data will be provided by what amounts to "likely corrupt, insecure, private organizations with zero oversight, security, or skill" is, to me, tantamount to advertising that far more than just the government will be in a position to control what is being scanned for. Could it be more obvious that there is a giant gaping security hole in the system as designed?
"There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities."
The reporting is not automatic, and yet by law it has to be. If Apple believes that something has been found, they must report it. "Automatic" here might not mean that it is done by a computer system, but their process, by law, must be automatic.
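For what it's worth, the detect-then-review flow Apple describes can be sketched roughly as follows. This is a purely illustrative mock-up: the function names and the structure are my assumptions, not Apple's implementation, and the threshold of 30 is only the figure Apple reportedly stated later.

```python
# Illustrative mock-up of the reporting flow Apple describes.
# Names and structure are invented for this sketch; this is NOT
# Apple's actual implementation.

MATCH_THRESHOLD = 30  # figure Apple reportedly stated later; an assumption here

def count_matches(image_hashes, blocklist):
    """Count how many of an account's image hashes appear in the blocklist."""
    blocked = set(blocklist)
    return sum(1 for h in image_hashes if h in blocked)

def review_account(image_hashes, blocklist):
    """Return the action taken for an account: nothing, or human review."""
    matches = count_matches(image_hashes, blocklist)
    if matches >= MATCH_THRESHOLD:
        # Only now does a human reviewer look at the flagged images;
        # a confirmed finding would then be reported to NCMEC.
        return "human_review"
    return "no_action"
```

Note that every gate in this flow (the contents of the blocklist, the threshold, the reviewer's judgment) is internal to Apple and unverifiable from the outside, which is exactly the concern here.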
-
More lies from the FAQ:
"Could governments force Apple to add non-CSAM images to the hash list? Apple will refuse any such demands."
So Apple is claiming that they will break the law, just go rogue, and somehow not be coerced? This is the same as saying that all agreements, contracts, and promises with them are void. The law is what makes their agreements with the public valid; if they place themselves above the law, Apple is announcing that they are under no obligation to do anything they have promised us. All bets are off, and they've said so themselves.
"Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups."
This is known to be computationally impossible, and they are clearly insulting the intelligence of their users by making such an impossible and ridiculous claim. It is not possible to build technology that can only ever scan for one category of content in this manner, and they know it, as does anyone who reads this.
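The reason the claim falls apart is that hash matching is inherently content-agnostic: the matcher compares opaque hashes against whatever list it is handed, so "only CSAM" is a property of the list, never of the technology. A minimal sketch (hypothetical hashes and list names, not Apple's NeuralHash code):

```python
# Hypothetical sketch showing that a hash matcher is agnostic to
# what its list contains. The hashes and lists below are invented;
# this is not Apple's NeuralHash implementation.

def scan_images(image_hashes, blocklist):
    """Return the subset of image hashes found in the blocklist."""
    blocked = set(blocklist)
    return [h for h in image_hashes if h in blocked]

csam_list = ["a1b2", "c3d4"]        # what Apple says the list holds
political_list = ["e5f6", "0a0b"]   # what a government could insert

photos = ["ffff", "c3d4", "e5f6"]

# Identical code, different list: the "CSAM-only" limit is purely
# a promise about the list's contents, never enforced by the code.
print(scan_images(photos, csam_list))       # ['c3d4']
print(scan_images(photos, political_list))  # ['e5f6']
```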
"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
Sure, this is likely true. But have they also caved to other demands? We will never know. This isn't something they, or we, can prove. We just have to take the word of someone bald-faced lying to us.
"Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it."
So our only protection is that liars might this one time be telling the truth?
-
And this doesn't even begin to cover the fact that Apple has now officially created a mechanism by which they can simply claim that a number of false positives occurred, and then manually access iCloud accounts and poke around. Of course, since iCloud is not encrypted, they have always had the ability to do this, or to be coerced into doing it. But there has always been a threat of discovery, which kept them from doing so casually, because it could be discovered and exposed.
But now Apple can openly peer into an account, share anything they want with the government, and make the unprovable claim that a threshold was reached, the account was made accessible, and they were merely looking for CSAM images as part of their process and "just happened to find" data exposing a government official's misconduct, or whatever.
Very handy how the controls, both technical and social, are being dismantled here.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
More lies from the FAQ:
"Could governments force Apple to add non-CSAM images to the hash list? Apple will refuse any such demands."
So Apple is claiming that they will break the law, just go rogue, and somehow not be coerced? This is the same as saying that all agreements, contracts, and promises with them are void. The law is what makes their agreements with the public valid; if they place themselves above the law, Apple is announcing that they are under no obligation to do anything they have promised us. All bets are off, and they've said so themselves.
"Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups."
This is known to be computationally impossible, and they are clearly insulting the intelligence of their users by making such an impossible and ridiculous claim. It is not possible to build technology that can only ever scan for one category of content in this manner, and they know it, as does anyone who reads this.
"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
Sure, this is likely true. But have they also caved to other demands? We will never know. This isn't something they, or we, can prove. We just have to take the word of someone bald-faced lying to us.
"Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it."
So our only protection is that liars might this one time be telling the truth?
TL;DR: The government has some dirt on us, and in order to not let that cat out of the bag, we're selling you bitches out.
Thanks for all the fish, bye.
-
-
https://appleinsider.com/articles/21/08/17/germany-writes-to-tim-cook-to-reconsider-csam-plans
The quotes from the German government are priceless.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
https://appleinsider.com/articles/21/08/17/germany-writes-to-tim-cook-to-reconsider-csam-plans
The quotes from the German government are priceless.
That's actually a really good response from what appears to be a competent government.
-
The man from the cover of Nevermind is suing to classify the album as child pornography. So, does owning this album count toward your Apple flag threshold? Major problems with this kind of scanning are already coming to light.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
The man from the cover of Nevermind is suing to classify the album as child pornography. So, does owning this album count toward your Apple flag threshold? Major problems with this kind of scanning are already coming to light.
That sounds like scamming, but it's interesting to see that the BBC cropped the image from the Nevermind album in their article, meaning they chose not to show the entire album cover.
However, he also said that there was no model release signed. And that is a whole different matter, because it means the record company didn't have the right to use the image on the album cover.
-
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
-
@jaredbusch said in Apple plans to scan your images for child porn:
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
True. But as the issue is over pornography, no release can be valid even if signed. But yes, he claims. If there was a signed release, you'd think there would be public pushback, as that's a big deal given the circumstances.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@jaredbusch said in Apple plans to scan your images for child porn:
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
True. But as the issue is over pornography, no release can be valid even if signed.
True. But pornography is not equal to nudity. No matter what the Puritanical fucking United States tries to say.
-
@jaredbusch said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@jaredbusch said in Apple plans to scan your images for child porn:
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
True. But as the issue is over pornography, no release can be valid even if signed.
True. But pornography is not equal to nudity. No matter what the Puritanical fucking United States tries to say.
But it's puritanical non-profits of unknown provenance and with no listed oversight who get to determine what is used by Apple. Apple itself has said it will apply no oversight and just blindly accept whatever some unlisted non-profits give it.
So, considering that Apple has not even suggested a standard it will apply, a simple claim like this is easily enough to meet their requirements, given what Apple has publicly stated. Then again, it might not be. We have no way of knowing, as the requirements are not Apple's but third parties', and any oversight of those third parties is unlisted and unknown.
-
@pete-s said in Apple plans to scan your images for child porn:
That sounds like scamming,
You are not the only one that thinks that.
he was happy to dine out on having been Nirvana Baby until about 10 seconds ago. In 2016, in a 25th anniversary recreation of the photo shoot, he even volunteered to do it naked again, before thinking better of the idea and posing in swim trunks.
-
@jaredbusch said in Apple plans to scan your images for child porn:
@pete-s said in Apple plans to scan your images for child porn:
That sounds like scamming,
You are not the only one that thinks that.
he was happy to dine out on having been Nirvana Baby until about 10 seconds ago. In 2016, in a 25th anniversary recreation of the photo shoot, he even volunteered to do it naked again, before thinking better of the idea and posing in swim trunks.
Certainly that's a popular, and potentially valid, opinion. And, haha, "poster child".
But it doesn't change the fact that a child was photographed naked and made into a commercial enterprise. I don't see any world where that should be okay. That anyone feels his parents should have been allowed to sign a release for that is, I think, a huge problem. It's no different from how parents here look the other way in child marriages, where girls far too young to have any decision-making ability are given away to men in their 30s and 40s. Illegal... unless the parents agree.
The "unless the parents agree" part sounds not so bad when you grow up in rich western Europe, the USA, or similar, and live at average or above-average income levels. But in many cultures, and in many places with poverty, parents having their children's interests at heart is often not the case. They don't want to hurt them, but there is a price on everything about them. That's why parents routinely sell their children to slavers.
So no matter what, in my opinion, what was done was wrong. No amount of him being okay with it now matters. No amount of his parents agreeing matters. It was porn, it was wrong, it will always be wrong.
It's like a priest molesting a child, and then the child repressing it or coming to terms with having been violated when they had no control. Even worse, the parents were involved! That kind of psychological trauma can never be overlooked. You have a lifetime of having to "deal with" the fact that it happened. You shouldn't also be forced to relive the trauma every day just to prove that it was bad "for you".
In this case, he's a victim. That's real. Calling it victim culture is victim shaming. That's also real. There's no way to say he was "okay with it" in the past. It is who he is, he HAS to live with it. It's like saying an amputee's pain and anguish isn't valid because they come to terms with their injury by laughing about it or being happy. The person who chopped off their arm shouldn't be excused from responsibility just because the victim isn't suffering even more than necessary. He can't go back and make it not happen. He is forced to make do with the situation that he is in.
Is it the worst thing that has ever happened? Obviously not. Has it crippled his life? We have no way to know. Was it sick, pointless, and wrong? Yes, yes it was, and I see no grey area on this. I think that using child nudity for commercial gain is a serious problem, and that the fame and popularity of an artist should not be grounds for looking the other way or for attempting to shift blame onto someone who was given no choice in the matter.
-
And the pile on officially begins.
https://9to5mac.com/2021/09/09/csam-scan-encrypted-messages/
-
@jaredbusch said in Apple plans to scan your images for child porn:
And the pile on officially begins.
https://9to5mac.com/2021/09/09/csam-scan-encrypted-messages/
Now we know how bad it is if the UK is supporting it! The ultimate surveillance state.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@jaredbusch said in Apple plans to scan your images for child porn:
And the pile on officially begins.
https://9to5mac.com/2021/09/09/csam-scan-encrypted-messages/
Now we know how bad it is if the UK is supporting it! The ultimate surveillance state.
China is way worse...
-
@dustinb3403 said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@jaredbusch said in Apple plans to scan your images for child porn:
And the pile on officially begins.
https://9to5mac.com/2021/09/09/csam-scan-encrypted-messages/
Now we know how bad it is if the UK is supporting it! The ultimate surveillance state.
China is way worse...
Yeah, but China is not considered part of the "free world"... and frankly, much of what was once thought to be part of it is moving toward not being so.
-
@dustinb3403 said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@jaredbusch said in Apple plans to scan your images for child porn:
And the pile on officially begins.
https://9to5mac.com/2021/09/09/csam-scan-encrypted-messages/
Now we know how bad it is if the UK is supporting it! The ultimate surveillance state.
China is way worse...
Sort of. It depends on what you measure. On cameras, they are worse (per capita, for real). But when it comes to far more intrusive data, they collect less.