Apple plans to scan your images for child porn
-
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
This is a tool that tells the government when it wants an upload triggered.
No this is not how it works at all. I'm saying it's the opposite. YOU do the upload. The tool only scans and packages what you are already uploading.
You can say that, but there's nothing to make it true. The scanning happens separately from any upload. The results of the scan get shared with Apple via the upload. The fact that, right now, uploads are triggered by the end user (or happen automatically, as is typical) is the same as the government being able to force it. Anything the end user can do on an online, connected device, the government can coerce with a warrant.
So saying "you upload it" and "the government can collect it at will" are synonymous.
Did you even read the whitepaper?
Apple's details are SUPER clear that the scanning is on device, and there is no possibility of them protecting against a warrant forcing an upload. There's no mechanism to stop that. So now that they can provide your data to the government, the government now has the legal ability to force it. The one creates the other.
-
In other systems, like Signal (or Telegram's secret chats), data is end-to-end encrypted so that the vendor has no access to it, and no possible way to gain access. That's the trick that stops the government, and that's why they do it. By not giving themselves access, they can't be forced to act maliciously on behalf of an authoritarian regime to scare, threaten, or attack political dissidents.
But Apple is leaving out this protection, making it so that they can be forced to gather information. Any statement about what they "won't" do is empty, because Apple has no legal right to make such promises. Claiming that they will break the law rather than obey a warrant is an automatically void contract, because contracts cannot be written that deny the law. So Apple knows it can make the claim freely, precisely because the claim creates no obligation.
-
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
To me it's clear from the white paper that if you don't upload images to iCloud, then this doesn't work. However, since I'm not an iPhone user, I don't know if you have any control over whether or not the photos stored on your phone are uploaded to your iCloud account.
Basically, when images are uploaded to your iCloud account, whether by your doing or automatically, your phone first does this hashing magic with CSAM results and then packages it along with your photo that is stored in your iCloud account. At that point, the iCloud scanners simply read the results that are packaged with your photo. I think of it like the photo having an attached health certificate that the iCloud scanner can pick up.
The issue is that they are scanning without the upload, and, based on that, can be forced to report on you. The upload limitation is a policy choice on their end that can't be enforced. So it's moot.
Another confirmation:
Note that this is known to be false. The statement, which defies the official statement from Apple, is a lie because in order to do what they say, they have to scan to know that data is from those sources. And notice it is "set to be" uploaded, not uploaded or uploading. This is a pretty ambiguous phrase that results in "can scan anytime."
The second part is "only for certain files." You have to scan everything to find something that you are looking for. It's physically impossible to search only the files on a certain list, because you can't know that a file is on that list until after you've already scanned it.
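A trivial sketch makes the point concrete. The hash function and file names below are hypothetical stand-ins (ordinary SHA-256, not Apple's actual NeuralHash), but the structure is unavoidable in any hash-list design: membership in the list can only be determined *after* hashing, so every file gets scanned.

```python
import hashlib

def hash_file_bytes(data: bytes) -> str:
    """Stand-in for a perceptual hash; here just SHA-256 of the bytes."""
    return hashlib.sha256(data).hexdigest()

def find_matches(files: dict[str, bytes], known_hashes: set[str]) -> list[str]:
    matches = []
    for name, data in files.items():
        # Every file must be hashed -- there is no way to know whether
        # a file is "on the list" without scanning it first.
        digest = hash_file_bytes(data)
        if digest in known_hashes:
            matches.append(name)
    return matches

files = {"a.jpg": b"cat photo", "b.jpg": b"dog photo"}
known = {hash_file_bytes(b"dog photo")}
print(find_matches(files, known))  # ['b.jpg'] -- but both files were hashed
```

The loop body runs for every file regardless of whether it matches; "only scanning listed files" would require knowing the hash before computing it.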
So whatever source you have for this has already been shown to be false. That all files must be scanned has been clear since the first moments of this discussion.
Whoever posted this clearly thought that people were pretty gullible and wouldn't think about this at all. That people are trying to defend Apple with such obvious lies makes it so much more obvious how bad it is and how bad they all know it is.
-
Any claim from Apple that they "won't" do something demanded by a government that they "can" do is a lie. Apple has no power to say no to that. It's not within their power; the government can seize the entire company if it wants. Denying that kind of action is like promising that you won't die if hit by a bullet.
Apple is basically resorting to just saying any crap hoping that their fanboy culture will protect them. But it isn't working. People are pointing out that it's obviously false and has no possibility of being true. Even Apple's own teams are speaking out now.
You can't rely on claims. If you want to say that there is some protection, you have to show that there is a technology that makes it impossible for Apple to do these things. If such a thing existed, Apple would be touting it all over the place. They are not, because there is not. Apple isn't claiming that they can't be forced to do these things, just that they will somehow deny the government, which is a claim anyone can make, but no one can make honestly.
-
Apple claims that they will additionally scan iCloud. Not files as uploaded, but all files including those already there. Not a big deal as iCloud has never been private unless encrypted, obviously. But this is something most people are also missing. It's a lot of scanning.
-
https://www.washingtonpost.com/opinions/2021/08/13/apple-csam-child-safety-tool-hashing-privacy/
Great point in this article. By scanning, Apple is required by law to report. The only protection that Apple has is by keeping the data private, even from themselves. So there is no logic to Apple opening themselves to prosecution for looking, but not reporting, unless there is something far bigger afoot.
Keep in mind, CSAM reporting laws already exist: if Apple scans, they have to report, even if they only scan on your device. They aren't allowed to wait for the iCloud upload to "maybe" happen. This is existing law, and Apple is obviously counting on it being overlooked in the hopes that people don't bring it up.
-
Let's break down a bit from Apple's FAQ:
"Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM? Our process is designed to prevent that from happening."
This is an absolute "yes". In no way do they say "no", because they can't. Just like "using a password" is designed to prevent the wrong people from accessing a system, we know that "our process is designed to prevent..." means anything but "our process will only ever...".
" CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations."
So we know that private companies in the US can be forced to do anything by the government, we know that non-profits tend heavily towards corruption, and we know that non-profits can trivially be hacked. So claiming that "likely totally corrupt, insecure, private companies with zero oversight or security or skill" will provide the list of data is, to me, tantamount to advertising that far more than just the government will be in a position to control what is being scanned for. Could it be more obvious that there is a giant gaping security hole in the system as designed?
"There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities."
The reporting is not automatic, and yet by law it has to be. If Apple believes that something has been found, they must report. "Automatic" here might not mean that it is done by a computer system, but their process, by law, must be automatic.
-
More lies from the FAQ:
"Could governments force Apple to add non-CSAM images to the hash list? Apple will refuse any such demands."
So Apple is claiming that they will break the law, just go rogue, and somehow not be coerced? Um, this is the same as saying all agreements, contracts, and promises with them are void. If they place themselves above the law, and the law is what makes their agreements with the public valid, then Apple is announcing here that they are under no obligation to do anything they have promised us. So all bets are off, and they've said so.
"Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. "
This is known to be computationally impossible and they are clearly insulting the intelligence of their users to make such an impossible and ridiculous claim. It's impossible to make technology that can only scan for one thing in this manner and they know it, and so does anyone who reads this.
"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
Sure, this is likely true. But have they also caved to demands? We will never know. This isn't something they, or we, can prove. We just have to take the word of someone bald-faced lying to us.
"Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it."
So our only protection is that liars might this one time be telling the truth?
-
And this doesn't even begin to cover the fact that Apple has now officially created a mechanism by which they can simply claim that a number of false positives existed and manually access iCloud accounts and poke around. Of course, since iCloud is not encrypted, they've always had the ability to do this, or to be coerced into doing it. But there has always been a threat of discovery, which keeps them from doing it casually in a way that could ever be discovered and exposed.
But now Apple can openly peer into, and share with the government, anything that they want, and make the unprovable claim that a threshold was detected, the account was made accessible, and they were looking for CSAM images as part of their process and "just happened to find" data exposing a government official's misconduct or whatever.
Very handy how the controls, both technical and social, are being dismantled here.
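The threshold mechanism at issue can be sketched in a few lines. This is a minimal illustration of the design Apple has described publicly (account contents become reviewable once a match counter crosses a threshold); all class and method names here are hypothetical, and the point is that the counter is a server-side claim no outside party can audit.

```python
THRESHOLD = 30  # Apple has publicly cited an initial threshold of about 30 matches

class Account:
    def __init__(self, name: str):
        self.name = name
        self.match_count = 0

    def report_match(self) -> None:
        # Incremented by the scanner; the user has no visibility into this.
        self.match_count += 1

    def review_allowed(self) -> bool:
        # Human review unlocks only past the threshold -- but nothing lets
        # an outside party verify the counter was legitimately reached.
        return self.match_count >= THRESHOLD

acct = Account("alice")
for _ in range(THRESHOLD):
    acct.report_match()
print(acct.review_allowed())  # True
```

Whether `report_match` was driven by genuine hash matches or by a claimed batch of "false positives" is indistinguishable from outside, which is exactly the concern raised above.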
-
@scottalanmiller said in Apple plans to scan your images for child porn:
More lies from the FAQ:
"Could governments force Apple to add non-CSAM images to the hash list? Apple will refuse any such demands."
So Apple is claiming that they will break the law, just go rogue, and somehow not be coerced? Um, this is the same as saying all agreements, contracts, and promises with them are void. If they place themselves above the law, and the law is what makes their agreements with the public valid, then Apple is announcing here that they are under no obligation to do anything they have promised us. So all bets are off, and they've said so.
"Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. "
This is known to be computationally impossible and they are clearly insulting the intelligence of their users to make such an impossible and ridiculous claim. It's impossible to make technology that can only scan for one thing in this manner and they know it, and so does anyone who reads this.
"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
Sure, this is likely true. But have they also caved to demands? We will never know. This isn't something they, or we, can prove. We just have to take the word of someone bald-faced lying to us.
"Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it."
So our only protection is that liars might this one time be telling the truth?
TL;DR: The government has some dirt on us, and in order to not let that cat out of the bag, we're selling you bitches out.
Thanks for all the fish, bye.
-
-
https://appleinsider.com/articles/21/08/17/germany-writes-to-tim-cook-to-reconsider-csam-plans
The quotes from the German government are priceless.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
https://appleinsider.com/articles/21/08/17/germany-writes-to-tim-cook-to-reconsider-csam-plans
The quotes from the German government are priceless.
That's actually a really good response from what appears to be a competent government.
-
The man from the cover of Nevermind is suing to classify the album as child pornography. So, does ownership of this album count toward your Apple flag limit? Major problems with this kind of scanning are already coming to light.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
The man from the cover of Nevermind is suing to classify the album as child pornography. So, does ownership of this album count toward your Apple flag limit? Major problems with this kind of scanning are already coming to light.
That sounds like scamming, but it's interesting to see that the BBC cropped the image from the Nevermind album in its article, meaning they chose not to show the entire image from the album cover.
However, he also said that there was no model release signed. And that is a whole different matter, because it means the record company didn't have the right to use the image on the album cover.
-
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
-
@jaredbusch said in Apple plans to scan your images for child porn:
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
True. But as the issue is over pornography, no release can be valid even if signed. But yes, he claims. If there were one, you'd think there would be public pushback, as that's a big deal given the circumstances.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@jaredbusch said in Apple plans to scan your images for child porn:
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
True. But as the issue is over pornography, no release can be valid even if signed.
True. But pornography is not equal to nudity. No matter what the Puritanical fucking United States tries to say.
-
@jaredbusch said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@jaredbusch said in Apple plans to scan your images for child porn:
@pete-s said in Apple plans to scan your images for child porn:
However, he also said that there was no model release signed.
He claims. Not his parents. He has been making a fuss about it for years and nothing has ever come of it.
True. But as the issue is over pornography, no release can be valid even if signed.
True. But pornography is not equal to nudity. No matter what the Puritanical fucking United States tries to say.
But it's puritanical non-profits of unknown provenance, with no listed oversight, who get to determine what is used by Apple. Apple itself has said it will apply no oversight and just blindly accept what some unlisted non-profits give it.
So, given that Apple has suggested no standard of its own to apply, a simple claim like this could easily be enough to meet the requirements, based on what Apple has publicly stated. Or it might not be; we have no means of knowing, as the requirements are not Apple's but third-party requirements, and any oversight of those parties is unlisted and unknown.
-
@pete-s said in Apple plans to scan your images for child porn:
That sounds like scamming,
You are not the only one that thinks that.
"he was happy to dine out on having been Nirvana Baby until about 10 seconds ago. In 2016, in a 25th anniversary recreation of the photo shoot, he even volunteered to do it naked again, before thinking better of the idea and posing in swim trunks."