Apple plans to scan your images for child porn
-
@marcinozga said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
@scottalanmiller said in Apple plans to scan your images for child porn:
We just have to trust them on that, if that's even what they are saying. The concern is that they are putting something on your device that scans your data (any data, it has to scan everything to look for one thing) and then sometimes reports what it finds to the government.
So the concern is about Apple doing something that they have explicitly said they won't do rather than any concerns about what they are actually saying they are doing. You could get tinhat with every tech company in that case.
The concern is Apple is bringing 1984 into reality. This isn't confined to their platforms (iCloud, Google Photos, Facebook, etc.) anymore; your own devices are being used as surveillance tools against you.
If you don't see an issue in a non-zero chance of being locked up because some machine learning model that can't tell the difference between a moon and a traffic light decided your pictures are child porn, then you're lost.
Government wanted backdoors in encryption, now they got one. From a company that championed end-to-end encryption and refused to create such backdoors in the past.
Talk about tinfoil hat!
This isn't anything new. They are simply TELLING you about it now. Hell, since cell phones came out, they have been tracking us anywhere and everywhere.
Is it bad? Oh hell yeah it's bad. Is there a damned thing we can do about it? Nope, not if you want to live in the modern world.
Scott seems like he's indicating he might be immune to this stuff because he's not in the USA - and of course, that's not true, not strictly speaking anyway. The gov't there would LOVE to do this as much as the US gov't does, they simply can't afford it.
But again, the original post is about Apple scanning to keep child porn off their platform/devices (I'm guessing they are now claiming you don't own your iPhone, you only rent it); otherwise I'm not sure of the legality of what they are doing...
-
@carnival-boy said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
And said software will upload results to Apple without your consent. And it just goes downhill from there.
I don't think it will.
Believe me - they'll get your consent on the next software update on page 3645 of the new EULA that almost no one reads.
-
@carnival-boy said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
And said software will upload results to Apple without your consent. And it just goes downhill from there.
I don't think it will.
Did you bother to read the article? If a match is found, it will be uploaded to Apple for manual verification. This alone should creep you out. Some random person looking at your pictures, deciding if it's child porn or not. And then passed on to some non-profit set up by the government. Zero transparency, no way to audit the whole process. What could possibly go wrong there?
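And manual review is baked in precisely because the matching can't be exact. Systems like this use perceptual hashes, which deliberately tolerate small image changes and can therefore collide on unrelated photos. A toy average hash shows the idea (an illustration of the concept only; Apple's NeuralHash is a learned hash and far more complex, and nothing below is their code):

```python
def average_hash(pixels):
    """pixels: an 8x8 grid (list of lists) of grayscale values 0-255."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for value in flat:
        # One bit per cell: brighter than average or not.
        bits = (bits << 1) | (1 if value >= mean else 0)
    return bits

def hamming(a, b):
    """Count of differing bits; a small distance is treated as a 'match'."""
    return bin(a ^ b).count("1")

# A moon shot and a traffic light at night are both a bright blob on a
# dark field, and they can collapse to the exact same 64-bit signature:
moonish = [[10] * 8 for _ in range(8)]
lightish = [[12] * 8 for _ in range(8)]
for img in (moonish, lightish):
    img[3][3] = img[3][4] = img[4][3] = img[4][4] = 250  # bright blob
print(hamming(average_hash(moonish), average_hash(lightish)))  # -> 0
```
Two different images, zero bits apart. That's the kind of collision a human reviewer is there to catch.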
-
I don't think so. I don't think Apple are too bad with privacy since their business model is still based on selling hardware rather than selling data. I trust them more than others.
If you don't use iCloud I don't think you have anything to be concerned about. They cannot access your phone.
-
@carnival-boy said in Apple plans to scan your images for child porn:
I don't think so. I don't think Apple are too bad with privacy since their business model is still based on selling hardware rather than selling data. I trust them more than others.
If you don't use iCloud I don't think you have anything to be concerned about. They cannot access your phone.
Again, read the article. You will get the software on the phone in iOS 15, and it will phone home. And unless you cut off internet access completely, there's not a thing you can do about it.
-
@marcinozga said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
And said software will upload results to Apple without your consent. And it just goes downhill from there.
I don't think it will.
Did you bother to read the article? If a match is found, it will be uploaded to Apple for manual verification. This alone should creep you out. Some random person looking at your pictures, deciding if it's child porn or not. And then passed on to some non-profit set up by the government. Zero transparency, no way to audit the whole process. What could possibly go wrong there?
I wonder why Apple is doing this? Why now?
Child porn is horrific - but damn... any time anyone wants to trample on your rights, it's the first thing they trot out - we gotta save the children... /sigh - Is child porn really this prevalent?
I say the same thing with gun violence... more people still die in car crashes every single day - why isn't the gov't mandating driverless cars? The rate won't be zero, but it would be significantly lower than it is today. OK, that was a tangent.
-
@marcinozga said in Apple plans to scan your images for child porn:
Again, read the article. You will get the software on the phone in iOS 15, and it will phone home. And unless you cut off internet access completely, there's not a thing you can do about it.
Which bits? I don't find the article that clear. It initially suggests the data is uploaded automatically from the phone, but it ends with these statements, which clearly say that only photos uploaded to iCloud are affected:
Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system
According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a “safety voucher” saying whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.
The article is also based on speculation and they haven't got Apple to comment.
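For what it's worth, the flow those closing statements describe would look roughly like the sketch below. The threshold, the names, and the use of SHA-256 are my assumptions for illustration; per the article, the real system uses a perceptual hash (neuralMatch) and rather more cryptography.

```python
# Sketch of the "safety voucher" flow as the article's closing
# paragraphs describe it - assumptions throughout, not Apple's code.
import hashlib

known_csam_hashes: set[str] = set()  # supplied by child-safety groups, per the article
REVIEW_THRESHOLD = 10                # assumed; the article never gives the real number

def safety_voucher(photo_bytes: bytes) -> dict:
    """Tag each iCloud-bound photo with a voucher: suspect or not."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return {"hash": digest, "suspect": digest in known_csam_hashes}

def should_review(vouchers: list[dict]) -> bool:
    """Only after enough suspect marks does Apple decrypt the photos
    and, if apparently illegal, pass them to the authorities."""
    return sum(v["suspect"] for v in vouchers) >= REVIEW_THRESHOLD
```
Note that in this reading, photos that never go to iCloud never get a voucher at all.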
-
@dashrender said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
@marcinozga said in Apple plans to scan your images for child porn:
And said software will upload results to Apple without your consent. And it just goes downhill from there.
I don't think it will.
Did you bother to read the article? If a match is found, it will be uploaded to Apple for manual verification. This alone should creep you out. Some random person looking at your pictures, deciding if it's child porn or not. And then passed on to some non-profit set up by the government. Zero transparency, no way to audit the whole process. What could possibly go wrong there?
I wonder why Apple is doing this? Why now?
Child porn is horrific - but damn... any time anyone wants to trample on your rights, it's the first thing they trot out - we gotta save the children... /sigh - Is child porn really this prevalent?
I say the same thing with gun violence... more people still die in car crashes every single day - why isn't the gov't mandating driverless cars? The rate won't be zero, but it would be significantly lower than it is today. OK, that was a tangent.
It always starts with children. And it's really not about that, because now that it's out, any pedophile that has 2 brain cells will simply stop using Apple devices.
-
And the Apple link clearly says it isn't doing what the article initially says will happen.
-
I haven't read the article.
that said - if what you lot are implying is true... then Apple basically just took a giant shit all over the privacy-first dance they've been doing for the past 5+ years.
-
On the other hand, if what I am implying is true.....
-
@marcinozga said in Apple plans to scan your images for child porn:
I bet 100% of parents have pictures of their naked children.
Definitely not. Including your child's genitals in a photo is a conscious decision you don't need to make.
We have photos of our children playing in the bathtub, for example, but we also made the conscious effort not to include their genitals in the photo. There's no reason to include that in a photo, regardless of intentions.
-
@carnival-boy said in Apple plans to scan your images for child porn:
On the other hand, if what I am implying is true.....
Then still nothing changes. If what you imply is true, the risk we are worried about is still there.
What you are implying is that we should blindly trust a private company AND all of the governments that have proven themselves to be untrustworthy and not working in the public's interest.
-
@dashrender said in Apple plans to scan your images for child porn:
I haven't read the article.
that said - if what you lot are implying is true... then Apple basically just took a giant shit all over the privacy-first dance they've been doing for the past 5+ years.
I don't think that there is an implication. The giant shit has already been taken. It's now done; their privacy game is over (for American customers).
-
@dashrender said in Apple plans to scan your images for child porn:
I wonder why apple is doing this? why now?
Pressure from an authoritarian regime concerned, as always, with curtailing freedom of speech without doing so directly enough to get the public to take action.
-
@carnival-boy said in Apple plans to scan your images for child porn:
If you don't use iCloud I don't think you have anything to be concerned about. They cannot access your phone.
The claim of the news release is that they are accessing the phones. That's the entire concern. If the entire thing is fake, then of course, it's fake and it's not a problem. The concern is not Apple scanning data that THEY host, it's scanning data that WE host.
-
@carnival-boy said in Apple plans to scan your images for child porn:
Which bits. I don't find the article that clear.
Very clear, first line: "Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans"
Installing software. On your device. Unless the sources are incorrect, this makes it crystal clear that they can access YOUR DATA and that the iCloud component is a red herring. This is the part that the researchers, and the people here, are concerned about.
This is like Pegasus. Sure, in theory it's used to stop terrorism. In the real world it is used to threaten reporters.
Anything that can scan your pictures can also steal them or plant new ones, at will. Will they? Sure, they say that they won't. But you cannot, ever, under any circumstances, trust a company that would do this to your phone. So, if the article is real, it's a given that they cannot be trusted anymore.
-
According to the article: "The proposals are Apple’s attempt to find a compromise between its own promise to protect customers’ privacy...."
The article states that this is Apple compromising on the promise of privacy. Compromising on privacy is, obviously, the same as going back on it. If the article is correct, Apple has flat-out decided that their privacy stance is no longer their policy (in the USA).
-
@scottalanmiller said in Apple plans to scan your images for child porn:
@carnival-boy said in Apple plans to scan your images for child porn:
If you don't use iCloud I don't think you have anything to be concerned about. They cannot access your phone.
The claim of the news release is that they are accessing the phones. That's the entire concern. If the entire thing is fake, then of course, it's fake and it's not a problem. The concern is not Apple scanning data that THEY host, it's scanning data that WE host.
Not fake, confirmed by Apple, https://www.apple.com/child-safety/
These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
and further
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
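So by Apple's own description, the matching step runs on the phone against a database shipped to the device, roughly like this (a simplification on my part: SHA-256 stands in for NeuralHash, and the "blinding" of the database is omitted):

```python
# Sketch of the on-device step as Apple's page describes it. SHA-256
# stands in for NeuralHash and the database blinding is omitted; both
# are simplifications, not Apple's actual implementation.
import hashlib

def load_ondevice_database() -> set[str]:
    # Shipped with the OS as an "unreadable set of hashes", per Apple.
    return set()  # placeholder; real contents come from NCMEC and others

def on_device_match(photo_bytes: bytes, database: set[str]) -> bool:
    # The check happens on the phone itself, not on Apple's servers.
    return hashlib.sha256(photo_bytes).hexdigest() in database
```
The scanning code living on your device, not in their cloud, is exactly the point of contention in this thread.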
-
I was planning on getting an iPhone, and this is definitely enough for me to go look at Xiaomi again, which, by the way, has SO MUCH BETTER cameras anyway.