Apple plans to scan your images for child porn
-
Looks like Apple has gone fully off the rails. Full-on spying on their workers at home...
-
@obsolesce said in Apple plans to scan your images for child porn:
To me it's clear from the white paper that if you don't upload images to icloud, then this doesn't work. However, since I'm not an iPhone user, I don't know if you have any control over whether or not the photos stored on your phone are uploaded to your icloud account.
Basically, when images are uploaded to your icloud account, by your doing or automatically, your phone first does this hashing magic with csam results and then packages it along with your photo that is stored in your icloud account. At that point, the icloud scanners simply read the results that are packaged with your photo. I think of it like the photo having an attached health certificate that icloud scanner can pick up.
The issue is that they are scanning without the upload, and can, based on that, be forced to report on you. The upload limitation is an option on their end that users can't enforce. So it's moot.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
Looks like Apple has gone fully off the rails. Full-on spying on their workers at home...
Not surprised - this is the exact mentality that some had around here... If I can't see you working/not working, then I assume you're not working. /sigh.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
The issue is that they are scanning without the upload, and can, based on that, be forced to report on you.
It seems like this whole thing is only put into action via a trigger, which is the uploading of the image to your iCloud account. Before then, there's no way for any reporting. The reporting is done via iCloud servers based on the image voucher that's with your image in iCloud.
This whole thing is all about the iCloud Photos account accumulating enough CSAM matching vouchers. Without your photos being uploaded to your iCloud Photos account, this whole thing is moot.
For reference: https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
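The gist of that accumulation mechanism can be sketched like this (illustrative Python only; Apple's real system uses NeuralHash, private set intersection, and threshold secret sharing, and names like `MATCH_THRESHOLD` and `Voucher` are invented here for the sketch):

```python
# Sketch of threshold-gated reporting as described in Apple's summary.
# NOT Apple's actual cryptography: the real design keeps individual match
# results unreadable until the threshold is crossed. All names invented.

from dataclasses import dataclass

MATCH_THRESHOLD = 30  # hypothetical value; the real threshold is Apple's choice


@dataclass
class Voucher:
    photo_id: str
    matches_known_csam: bool  # result of the on-device hash comparison


def account_review_triggered(vouchers):
    """Server side: count matching vouchers uploaded with photos; only
    past the threshold can the account be flagged for human review."""
    matches = sum(1 for v in vouchers if v.matches_known_csam)
    return matches >= MATCH_THRESHOLD
```

The point the sketch captures is that, per the summary, no single voucher does anything by itself; reporting hinges on an iCloud Photos account accumulating matches.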
-
This is something law enforcement should be doing with a court-ordered warrant, not Apple. Talk about big tech overreach. I get it, they want to stop it before it happens, but they are not the police.
-
@itivan80 said in Apple plans to scan your images for child porn:
This is something law enforcement should be doing with a court-ordered warrant, not Apple. Talk about big tech overreach. I get it, they want to stop it before it happens, but they are not the police.
I agree. And also, why not scan for every other type of potential crime? Drug trafficking, terrorism, murder, war crimes, etc.
Maybe Apple should record all your conversations with the built-in mic in your device to keep track of what you're saying - without you knowing and without a warrant, of course. Oh, I forgot, they were already caught doing that with Siri two years ago.
-
@pete-s said in Apple plans to scan your images for child porn:
@itivan80 said in Apple plans to scan your images for child porn:
This is something law enforcement should be doing with a court-ordered warrant, not Apple. Talk about big tech overreach. I get it, they want to stop it before it happens, but they are not the police.
I agree. And also, why not scan for every other type of potential crime? Drug trafficking, terrorism, murder, war crimes, etc.
Maybe Apple should record all your conversations with the built-in mic in your device to keep track of what you're saying - without you knowing and without a warrant, of course. Oh, I forgot, they were already caught doing that with Siri two years ago.
This is Scott's entire point...
And this child porn is literally just the foot in the door - tomorrow they WILL be searching for those things because warrants will make them.
-
@itivan80 said in Apple plans to scan your images for child porn:
This is something law enforcement should be doing with a court-ordered warrant, not Apple. Talk about big tech overreach. I get it, they want to stop it before it happens, but they are not the police.
No, because without the backdoor breaking the chain of encryption, nothing is there for a warrant...
-
@itivan80 said in Apple plans to scan your images for child porn:
This is something law enforcement should be doing with a court-ordered warrant, not Apple. Talk about big tech overreach. I get it, they want to stop it before it happens, but they are not the police.
Well in a way, they are actually now private unlicensed police.
-
@stacksofplates said in Apple plans to scan your images for child porn:
Also, if you look at their diagram in their white paper, the photo is part of the safety voucher, which is what is uploaded to iCloud.
So this is what I was getting at earlier.
This voucher is uploaded to iCloud Photos along with the image.
Is that separate from iCloud backup, or is the voucher sent along with the image when it's backed up? By their process description the photo has to be sent as well, because they can't verify it otherwise.
This is why it's not straightforward and why I think @Carnival-Boy was making those statements.
It looks like it's only a single package that goes to iCloud. Either you choose to back up your photo to iCloud, in which case it's packaged with the voucher... or NOTHING happens at all. The photo is not sent to iCloud and no scanning or anything happens.
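That upload-gated flow reads roughly like this (a sketch of the white paper's description, not real code; the function names and the package shape are invented):

```python
# Sketch of the upload-gated flow described in Apple's technical summary.
# Invented names throughout; the only point illustrated is that hashing
# and voucher creation happen as part of the iCloud Photos upload path,
# and that nothing happens when the upload doesn't.

def compute_safety_voucher(photo):
    # Placeholder for the on-device perceptual-hash comparison against
    # the known-CSAM hash database shipped with the OS.
    return {"photo_id": id(photo), "matches_known_csam": False}


def upload_photo_to_icloud(photo, icloud_photos_enabled):
    if not icloud_photos_enabled:
        return None  # nothing is hashed, packaged, or sent
    voucher = compute_safety_voucher(photo)  # computed on-device first
    return {"photo": photo, "voucher": voucher}  # one package sent to iCloud
```

Scott's counterpoint above is that this gate is a policy choice in Apple's own code path, not a technical limit anyone outside Apple can verify or enforce.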
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
To me it's clear from the white paper that if you don't upload images to iCloud, then this doesn't work. However, since I'm not an iPhone user, I don't know if you have any control over whether or not the photos stored on your phone are uploaded to your iCloud account.
Basically, when images are uploaded to your iCloud account, by your doing or automatically, your phone first does this hashing magic with CSAM results and then packages it along with your photo that is stored in your iCloud account. At that point, the iCloud scanners simply read the results that are packaged with your photo. I think of it like the photo having an attached health certificate that the iCloud scanner can pick up.
The issue is that they are scanning without the upload, and can, based on that, be forced to report on you. The upload limitation is an option on their end that users can't enforce. So it's moot.
Another confirmation:
-
@obsolesce if it was only on upload, why put the signatures on every phone?
-
@jaredbusch said in Apple plans to scan your images for child porn:
@obsolesce if it was only on upload, why put the signatures on every phone?
They need to be on the device if photos are going to be uploaded to iCloud Photos, since I assume photo uploads are the typical behavior. I'm sure there are tons of things on these devices that aren't ever used. I doubt the signatures being on the phone matter if they're not used; that'll likely be the minority.
-
@obsolesce said in Apple plans to scan your images for child porn:
I doubt the signatures being on the phone matters if it's not used, that'll likely be the minority.
They are a problem because they can be used. The gov't can force the upload with a warrant. So the existence of the capability is the exposure itself.
It's true, semantically we can say that there is no risk without an upload (so a phone that is dead, for example, is not affected). But as the gov't, or Apple as an organization, has the power to enact the upload at will, the problem exists before that point.
Similarly we could say that the upload doesn't matter because we aren't exposed until the data from the upload is handed to the gov't.
Or we could say that the gov't stealing our data doesn't matter until they use it maliciously.
Or we could say that being arrested and taken to court for pictures that Apple claims to have found on our phones, or the police claim to have gotten from Apple, doesn't matter until we end up in jail for something we didn't do.
Any step taking away our freedom of speech, any step threatening journalists or minorities can be seen as irrelevant until it is fully used to hurt someone. But that's not the case. The power and opportunity itself is a threat and threats are primarily what stop freedoms.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
It's true, semantically we can say that there is no risk without an upload (so a phone that is dead, for example, is not affected.) But as the gov't or Apple as an organization has the power to enact the upload at will the problem exists before that point.
How do they force photos stored on your phone to upload to iCloud? I have OneDrive on my phone, and if I have my photos or the photos folder set to not upload to OneDrive, it doesn't. If Microsoft or the government wanted to force my photos to upload, I don't see how they could do that. I mean, I suppose the government or Apple could always do something to force photo uploads to anywhere they want through an OS update, regardless of anything at all... so I fail to see how this enables something that they could do at any time anyway.
-
@obsolesce said in Apple plans to scan your images for child porn:
How do they force photos stored on your phone to upload to iCloud?
They are adding a tool to the OS that detects what is on your phone. Apple has 100% power, as long as you have an Internet connection, to use that to trigger any upload. From a software standpoint, this is absolutely trivial to do.
-
@obsolesce said in Apple plans to scan your images for child porn:
If Microsoft or the government wanted to force my photos to upload, I don't see how they could do that.
They definitely can. But MS doesn't scan the contents to know what to upload. So getting a warrant to force a blind upload, in the hope of finding evidence of something, is a very different matter. From a technology standpoint it's trivial, but legally it's protected in ways Apple's product is designed not to be.
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@obsolesce said in Apple plans to scan your images for child porn:
How do they force photos stored on your phone to upload to iCloud?
They are adding a tool to the OS that detects what is on your phone. Apple has 100% power, as long as you have an Internet connection, to use that to trigger any upload. From a software standpoint, this is absolutely trivial to do.
That tool does not trigger any uploads. That tool itself is what is triggered, not the other way around. If you upload a photo to iCloud, then the tool is triggered to scan the photo to package it for the upload to iCloud.
-
@obsolesce said in Apple plans to scan your images for child porn:
I suppose the government or Apple could always do something to force photo uploads to anywhere they want through an OS update, regardless of anything at all... so I fail to see how this enables something that they could do at any time without it anyways?
This is a tool that tells the government when there is something worth triggering an upload for. It literally changes everything. The ability to trigger an upload is universal in closed-source or blindly-updated operating systems. That's a pre-existing condition that we can't avoid.
Apple is, for the first time, building a tool that can alert the government as to when forcing an upload will get it the results it wants.