Apple plans to scan your images for child porn
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
To me it's clear from the white paper that if you don't upload images to iCloud, then this doesn't work. However, since I'm not an iPhone user, I don't know if you have any control over whether or not the photos stored on your phone are uploaded to your iCloud account.
Basically, when images are uploaded to your iCloud account, by your doing or automatically, your phone first does this hashing magic with CSAM results and then packages it along with your photo that is stored in your iCloud account. At that point, the iCloud scanners simply read the results that are packaged with your photo. I think of it like the photo having an attached health certificate that the iCloud scanner can pick up.
The issue is that they are scanning without the upload, and can based on that be forced to report on you. The upload limitation is an option on their end that they can't enforce. So it's moot.
Another confirmation:
-
@obsolesce if it was only on upload why put the signatures on every phone.
-
@jaredbusch said in Apple plans to scan your images for child porn:
@obsolesce if it was only on upload why put the signatures on every phone.
The signatures need to be on the device if photos are going to be uploaded to iCloud Photos, since that upload is the trigger. I'm sure there are tons of things on these devices that are never used. I doubt the signatures being on the phone matter if they're not used; that'll likely be the minority of cases.
-
@obsolesce said in Apple plans to scan your images for child porn:
I doubt the signatures being on the phone matters if it's not used, that'll likely be the minority.
They are a problem because they can be used. The gov't can force the upload with a warrant. So the existence of the capability is the exposure itself.
It's true, semantically we can say that there is no risk without an upload (so a phone that is dead, for example, is not affected). But since the gov't, or Apple as an organization, has the power to enact the upload at will, the problem exists before that point.
Similarly we could say that the upload doesn't matter because we aren't exposed until the data from the upload is handed to the gov't.
Or we could say that the gov't stealing our data doesn't matter until they use it maliciously.
Or we could say that being arrested and taken to court for pictures that Apple claims to have found on our phones, or the police claim to have gotten from Apple, doesn't matter until we end up in jail for something we didn't do.
Any step taking away our freedom of speech, any step threatening journalists or minorities can be seen as irrelevant until it is fully used to hurt someone. But that's not the case. The power and opportunity itself is a threat and threats are primarily what stop freedoms.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
It's true, semantically we can say that there is no risk without an upload (so a phone that is dead, for example, is not affected.) But as the gov't or Apple as an organization has the power to enact the upload at will the problem exists before that point.
How do they force photos stored on your phone to upload to iCloud? I have OneDrive on my phone, and if I have my photos or the photos folder set to not upload to OneDrive, it doesn't. If Microsoft or the government wanted to force my photos to upload, I don't see how they could do that. I mean, I suppose the government or Apple could always do something to force photo uploads to anywhere they want through an OS update, regardless of anything at all... so I fail to see how this enables something that they could do at any time without it anyway?
-
@obsolesce said in Apple plans to scan your images for child porn:
How do they force photos stored on your phone to upload to iCloud?
They are adding a tool to the OS that detects what is on your phone. Apple has 100% power, as long as you have an Internet connection, to use that to trigger any upload. From a software standpoint, this is absolutely trivial to do.
-
@obsolesce said in Apple plans to scan your images for child porn:
If Microsoft or the government wanted to force my photos to upload, I don't see how they could do that.
They definitely can. But MS doesn't scan the contents to know what to upload, so getting a warrant to force a blind upload in the hope of finding evidence is a very different matter. From a technology standpoint it's trivial, but legally it's protected in ways Apple's product is designed not to be.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
How do they force photos stored on your phone to upload to iCloud?
They are adding a tool to the OS that detects what is on your phone. Apple has 100% power, as long as you have an Internet connection, to use that to trigger any upload. From a software standpoint, this is absolutely trivial to do.
That tool does not trigger any uploads. That tool itself is what is triggered, not the other way around. If you upload a photo to iCloud, then the tool is triggered to scan the photo to package it for the upload to iCloud.
-
@obsolesce said in Apple plans to scan your images for child porn:
I suppose the government or Apple could always do something to force photo uploads to anywhere they want through an OS update, regardless of anything at all... so I fail to see how this enables something that they could do at any time without it anyways?
This is a tool that tells the government when it wants an upload triggered. It literally changes everything. The ability to trigger an upload is universal in closed-source or blindly-updated operating systems. That's a pre-existing condition that we can't avoid.
Apple is, for the first time, building a tool that can alert the government when to force it to happen to make sure that they get results that they want.
-
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
How do they force photos stored on your phone to upload to iCloud?
They are adding a tool to the OS that detects what is on your phone. Apple has 100% power, as long as you have an Internet connection, to use that to trigger any upload. From a software standpoint, this is absolutely trivial to do.
That tool does not trigger any uploads. That tool itself is what is triggered, not the other way around. If you upload a photo to iCloud, then the tool is triggered to scan the photo to package it for the upload to iCloud.
That's incorrect and every source agrees. The tool does the scanning ON THE DEVICE. Apple promises to only look at data uploaded to iCloud. But Apple themselves have been crystal clear that scanning happens regardless of an iCloud account or any uploads. Scanning is universal, automatic, and unstoppable even if you don't have an Internet connection.
Carnival made up the bit about it only scanning things that were uploaded in his irrational attempt to defend Apple's behaviour. Not even Apple provided a hint of that being true, as we found over and over again. And every legal source, like the EFF, points out that it is the on-device scanning that we are concerned about and discussing. Absolutely no one is concerned (significantly) about the scanning of iCloud, which Apple hasn't said it will do. It simply uses iCloud as the mechanism to harvest the data after the scanning has identified it.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
This is a tool that tells the government when it wants an upload triggered.
No this is not how it works at all. I'm saying it's the opposite. YOU do the upload. The tool only scans and packages what you are already uploading.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
The tool does the scanning ON THE DEVICE.
Yes, that's true. I never said the tool doesn't scan on the device. The tool scans the photo on the device to match CSAM signatures. That has nothing at all to do with the tool itself uploading anything. It only does the scan as part of YOUR upload to iCloud Photos.
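Setting aside who triggers the upload, the mechanism both sides are describing can be sketched roughly as follows. Everything here is hypothetical: the real system uses Apple's NeuralHash and a blinded database with threshold secret sharing, not a plain SHA-256 lookup; this only illustrates the "scan on device, attach the result to the upload" flow.

```python
import hashlib
import json

# Hypothetical on-device signature list (stand-in for the blinded CSAM DB).
CSAM_SIGNATURES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def package_for_upload(photo_bytes: bytes) -> dict:
    """Scan a photo on-device and attach the result (a "safety voucher")
    to the payload that accompanies the iCloud upload."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    voucher = {"matched": digest in CSAM_SIGNATURES}
    return {"photo": photo_bytes.hex(), "voucher": voucher}

# The scan happens on the device; the server only reads the voucher.
payload = package_for_upload(b"vacation-photo-bytes")
print(json.dumps(payload["voucher"]))
```

The disagreement in the thread is not about this flow itself, but about who can cause `package_for_upload` to run and the payload to be sent.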
-
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
This is a tool that tells the government when it wants an upload triggered.
No this is not how it works at all. I'm saying it's the opposite. YOU do the upload. The tool only scans and packages what you are already uploading.
You can say that, but there's nothing to make that true. The scanning happens separate from any upload. The results of the scan get shared to Apple via the upload. The fact that right now uploads are triggered by the end user (or automated as actually happens) is the same as the government being able to force it. Anything the end user can do on an online connected device the government can coerce with a warrant.
So saying "you upload it" and "the government can collect it at will" are synonymous.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
This is a tool that tells the government when it wants an upload triggered.
No this is not how it works at all. I'm saying it's the opposite. YOU do the upload. The tool only scans and packages what you are already uploading.
You can say that, but there's nothing to make that true. The scanning happens separate from any upload. The results of the scan get shared to Apple via the upload. The fact that right now uploads are triggered by the end user (or automated as actually happens) is the same as the government being able to force it. Anything the end user can do on an online connected device the government can coerce with a warrant.
So saying "you upload it" and "the government can collect it at will" are synonymous.
Did you even read the whitepaper?
-
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
The tool does the scanning ON THE DEVICE.
Yes, that's true. I never said the tool doesn't scan on the device. The tool scans the photo on the device to match CSAM signatures. That has nothing at all to do with the tool itself uploading anything. It only does the scan as part of YOUR upload to iCloud Photos.
You keep saying "your". All uploads are "your" uploads, no matter who triggers them: the OS, Apple, the gov't. It's all ignoring the issue that it scans your data and reports on it to the government, and that none of Apple's claimed protections are possible.
-
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
This is a tool that tells the government when it wants an upload triggered.
No this is not how it works at all. I'm saying it's the opposite. YOU do the upload. The tool only scans and packages what you are already uploading.
You can say that, but there's nothing to make that true. The scanning happens separate from any upload. The results of the scan get shared to Apple via the upload. The fact that right now uploads are triggered by the end user (or automated as actually happens) is the same as the government being able to force it. Anything the end user can do on an online connected device the government can coerce with a warrant.
So saying "you upload it" and "the government can collect it at will" are synonymous.
Did you even read the whitepaper?
Yup. So did the EFF. And that's why we both have the same conclusion.
-
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
This is a tool that tells the government when it wants an upload triggered.
No this is not how it works at all. I'm saying it's the opposite. YOU do the upload. The tool only scans and packages what you are already uploading.
You can say that, but there's nothing to make that true. The scanning happens separate from any upload. The results of the scan get shared to Apple via the upload. The fact that right now uploads are triggered by the end user (or automated as actually happens) is the same as the government being able to force it. Anything the end user can do on an online connected device the government can coerce with a warrant.
So saying "you upload it" and "the government can collect it at will" are synonymous.
Did you even read the whitepaper?
Apple's details are SUPER clear that the scanning is on device, and there is no possibility of them protecting against a warrant forcing an upload. There's no mechanism to stop that. So once they can provide your data to the government, the government has the legal ability to force it. The one creates the other.
-
In other situations, like Telegram or Signal, data is fully encrypted in a way that gives the vendor no access to it, no possible access. That's the trick to stop the government, and that's why they do it. By not giving themselves access, they can't be forced to act maliciously for an authoritarian regime trying to scare, threaten, or attack political dissidents.
But Apple is leaving out this protection and making it so that they can be forced to gather information. Anything about what they "won't" do is false, as those are statements that Apple cannot make. They don't have the legal right to make those statements. Claiming that they will break the law if asked to obey a warrant is an automatically void contract, because contracts cannot be written that claim to override the law. So Apple knows that they can safely make the claim, because making it creates no obligation.
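The "no possible access" design described above can be illustrated with a toy one-time pad: the key is generated and kept on the client, so the server stores only ciphertext it cannot decrypt, and a warrant served on the vendor yields nothing useful. This is a deliberately minimal sketch; Signal actually uses the Double Ratchet protocol, not a one-time pad.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR one-time pad: secure as long as the key is random,
    # the same length as the message, and never reused.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

# Client side: the key never leaves the device.
message = b"political dissent"
key = secrets.token_bytes(len(message))
ciphertext = encrypt(message, key)

# Server side: only ciphertext is stored. The vendor can be compelled
# to hand this over, but without the key it reveals nothing.
server_storage = ciphertext

# Decryption is only possible where the key lives: on the client.
assert decrypt(server_storage, key) == message
```

The contrast with the CSAM design is that Apple's scheme deliberately gives the vendor readable scan results, so there is something a warrant can reach.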
-
@obsolesce said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
To me it's clear from the white paper that if you don't upload images to iCloud, then this doesn't work. However, since I'm not an iPhone user, I don't know if you have any control over whether or not the photos stored on your phone are uploaded to your iCloud account.
Basically, when images are uploaded to your iCloud account, by your doing or automatically, your phone first does this hashing magic with CSAM results and then packages it along with your photo that is stored in your iCloud account. At that point, the iCloud scanners simply read the results that are packaged with your photo. I think of it like the photo having an attached health certificate that the iCloud scanner can pick up.
The issue is that they are scanning without the upload, and can based on that be forced to report on you. The upload limitation is an option on their end that they can't enforce. So it's moot.
Another confirmation:
Noting that this is known to be false. The statement, which defies the official statement from Apple, is a lie because in order to do what they say, they have to scan to know that data is from those sources. And notice it is "set to be" uploaded, not uploaded or uploading. This is a pretty ambiguous phrase that results in "can scan anytime."
The second part is "only for certain files." You have to scan everything to find something that you are looking for. It's physically impossible to only search files that are on a certain list, because you can't know that they are on that list until after having already scanned them.
So whatever source you have for this was already stated as being false previously. That all files must be scanned has been said since the first moments of this discussion.
Whoever posted this clearly thought that people were pretty gullible and wouldn't think about this at all. That people are trying to defend Apple with such obvious lies makes it so much more obvious how bad it is and how bad they all know it is.
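The "must scan everything" point holds for any hash-based matcher: membership in the "certain files" list is only knowable after a file has been hashed, so every file gets processed. A hypothetical sketch (names and data are made up):

```python
import hashlib

def scan_all(files: dict[str, bytes], signatures: set[str]) -> list[str]:
    """Return the names of files whose hashes appear in the signature list.

    Note that every file must be hashed: there is no way to skip one,
    because we cannot know whether it is on the list until after it
    has already been hashed.
    """
    matches = []
    for name, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        if digest in signatures:
            matches.append(name)
    return matches

library = {"a.jpg": b"harmless", "b.jpg": b"flagged-bytes"}
sigs = {hashlib.sha256(b"flagged-bytes").hexdigest()}
print(scan_all(library, sigs))  # every photo was hashed to find the one match
```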
-
Any claim from Apple that says they "won't" do something demanded by a government that they "can" do is a lie. Apple has no power to say no to that. It's not within their power; the government can seize the entire company if it wants. Denying that kind of power is like promising that you won't die if hit by a bullet.
Apple is basically resorting to just saying any crap hoping that their fanboy culture will protect them. But it isn't working. People are pointing out that it's obviously false and has no possibility of being true. Even Apple's own teams are speaking out now.
You can't rely on claims. If you want to say that there is some protection, you have to show that there is a technology that makes it impossible for Apple to do these things. If such a thing existed, Apple would be touting it all over the place. They are not, because there is not. Apple isn't claiming that they can't be forced to do these things, just that they will somehow deny the government, which is a claim anyone can make but no one can make honestly.