So, what does Apple get out of all this, except negative attention, erosion of their image, possible privacy lawsuits, etc?

I just don't understand what Apple's motivation would have been here. Surely this fallout could have been anticipated?



Is it like other stories I hear on HN where one guy or team is trying to get a promotion so keeps pushing their project? And people were afraid to oppose it? I'm baffled that this made it all the way to implementation, even in the birthplace of the reality distortion field.


Something this major, company brand changing, would have required a lot of executive sign-offs, presumably even Cook's personal blessing.


Cynical speculation:

Apple have decided their position of not being able to provide access to law enforcement is becoming a liability. They're probably under intense pressure from several governments on that front.

This is a way to intentionally let their hand be forced into scanning for arbitrary hashes on devices at the behest of governments, taking pressure off Apple and easing their relations with governments. They take a PR hit now, but it's not too bad since it's ostensibly about fighting child abuse, and Apple's heart is clearly in the right place. When later, inevitably, the hashes start to include other material, Apple can say their hands are tied on the matter - they can no longer use the "can't do it" defense and are forced to comply. This is much simpler than having to fight about it all the time.


My guess is that internally, they've realized they have a big CP problem on iCloud. That's a huge liability.


I have serious doubts about that. CSAM really isn't an issue in the US, culturally and legally.


Really? It isn't an issue?


It is?


It gets the FBI off their back about not doing enough to stop the spread of CP.


I find it hard to believe CSAM was so pervasive on iDevices that they'd feel compelled to do something about it.

As far as we know (and I'm sure lots of eyeballs are looking now) Android doesn't do this.

And frankly, why would Apple care whether the FBI is cozy with them? Their entire brand is "security and privacy", which kind of goes against most three-letter agencies anyway.


At least according to [1]:

"Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users."

If you were a law enforcement agency and noticed this discrepancy, wouldn't you believe that some number of child abusers were getting away because of that 20-million-report gap? iCloud probably doesn't have the same level of adoption as Facebook, but the difference is still very large.

[1] https://www.nytimes.com/2021/08/05/technology/apple-iphones-...


I agree CSAM isn't likely to be pervasive in the photo libraries on iOS devices.

Android does not do on-device scanning, but Google does scan photos after they are uploaded to their cloud photo service. The effect is functionally identical: photos being uploaded to the cloud are scanned for CSAM either way. The only real distinction is who owns the CPU that computes the hash, as the sketch below illustrates.
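
A minimal sketch of that point, in Python. Everything here is a hypothetical illustration (the hash function, the blocklist, the function names are mine, not Apple's or Google's actual APIs), and real pipelines use a perceptual hash like NeuralHash rather than a cryptographic one; the only thing that differs between the two designs is where the check runs:

    import hashlib

    # Toy stand-in: real systems use a *perceptual* hash, not SHA-256.
    # SHA-256 is used here only to make the sketch runnable.
    def photo_hash(photo: bytes) -> str:
        return hashlib.sha256(photo).hexdigest()

    # Hypothetical blocklist of known-bad hashes.
    BLOCKLIST = {photo_hash(b"known-bad-example")}

    def scan_on_device_then_upload(photo: bytes) -> bool:
        # Apple's new placement: hash computed on the owner's CPU, before upload.
        return photo_hash(photo) in BLOCKLIST

    def upload_then_scan_on_server(photo: bytes) -> bool:
        # Google-style placement: hash computed on the provider's CPU, after upload.
        return photo_hash(photo) in BLOCKLIST

    # Same predicate, same result; only where it runs differs.
    photo = b"...image bytes..."
    assert scan_on_device_then_upload(photo) == upload_then_scan_on_server(photo)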

I doubt it's the FBI pressuring Apple. My suspicion is it's fear of the US Congress passing worse, even more privacy-invading laws under the guise of combating CSAM. If Apple's lobbyists can show that iPhones are already searching for CSAM, arguments for such laws get weaker.


> Android doesn't do on-device scanning, but Google does scan photos after they are uploaded to their cloud photo service.

So did Apple, and pretty much all cloud hosting providers.

This on-device scanning is what's new, and it's very out of character for Apple.

> If Apple's lobbyists can show that iPhones are already searching for CSAM, arguments for such laws get weaker.

I'm not aware of any big anti-CSAM push being made by Congress. CSAM just isn't really a big issue in the US; the existing laws and culture are pretty effective already.


CSAM can never be a policy issue with two sides, because everyone is in agreement that we need to protect children. The higher powers want to prevent child abuse, and CSAM is directly tied to child abuse. When people argue that "think of the children" can be weaponized to attack their freedoms, they wouldn't dare try to argue against the premise that children are harmed because of CSAM - not because the arguments will fall on the deaf ears of some governmental agents trying to push an agenda, but because the premise itself is sound.

As a result, people will focus their arguments instead on the technological flaws in the current implementation of on-device scanning or slippery slope arguments that are unlikely to become reality, the feature will be added anyway with no political opposition, and in the end Apple and/or the government will get what they want, for what they consider the greater good.

I think that absolute privacy in society as a whole isn't attainable with those values in place, and it raises many questions regarding to what extent the Internet should remain free from moderation. Are there really no kinds of information that are so fundamentally damaging that they should not be allowed to exist on someone's hard drive? If not, who will be in control of moderating that information? Maybe we will have to accept that some tradeoffs between privacy and stability need to be made for the collective good, in limited circumstances.


There is a lower limit to privacy (as a human right), below which societies would cease to be "free" (liberal democracies?). But that's not a discussion people seem to want to have when talking about their good intentions of fighting horrible things.


> I'm not aware of any big anti-CSAM push being made by Congress.

Right now. The best time for Apple to do this is when it cannot be painted as a defensive move against any specific legislation. The CSAM argument has been used many times in the past, and it's certain to be used many more times in the future.


Apple did not scan uploaded images; Apple has never scanned iPhone images. Last year Apple reported 245 cases to the National Center for Missing & Exploited Children, while Facebook reported 50M and Google ~4M; Apple was below the line.

https://www.nytimes.com/2020/02/07/us/online-child-sexual-ab...


We do know that Apple has been scanning email attachments sent via iCloud Mail. I don't think it has ever been claimed that Apple scanned anyone's iCloud Photo Library.

Ethics aside, on-device scanning has the benefit of Constitutional protection, at least in the USA. Because the searching is being performed on private property, any attempt by the Government to try to expand the scope of searches would be a clear-cut 4th Amendment violation.

(Whereas if the scanning is done in the cloud, Government can compel searches and that would fall under the "third party doctrine" which is an end-run around the 4th Amendment.)


> Because the searching is being performed on private property,

Is it though? A device someone bought two years ago suddenly starts reporting its owner to the FBI's anti-CSAM unit without the owner's realistic consent; that does seem like a run-around of the limits on unpermissioned government searches. It's not reasonable to say "throw away your $1200 device if you don't consent", is it? Nor can a person reasonably avoid the iOS updates that force this feature to be active.

> any attempt by the Government to try to expand the scope of searches

We've seen private companies willfully censor individuals at the government's behest under the current administration - will Apple begin expanding the search and reporting mechanisms just to stay in whatever administration's good graces?

Like I said, this is extremely out of character, and very off-brand, for Apple. Why would anyone trust Apple going forward? Even Google's Android doesn't snitch on its owners to law enforcement... and that's setting aside all the ways nefarious actors could abuse this system and sic LE on innocent individuals.


> Is it though?

Yes. Your phone is your private property, just like your house or your car. Searching your private property requires a warrant or reasonable suspicion, otherwise it's a 4th Amendment violation.

This twitter thread is worth a read.

https://twitter.com/pwnallthethings/status/14248736290037022...


The 4th amendment only protects you from the government searching your property. Otherwise, the Microsoft telemetry which reports back to Microsoft what software you have installed and what apps you are running would be illegal.


So, what does a person do if they do not consent to this search? Tough?

You can't realistically avoid the iOS update. Apple has effectively given consent on your behalf... How will that fly?


If you do not consent to having your photos scanned for CSAM, turn off iCloud Photo Library. Same as how you opt out of CSAM scanning of your photo library on Android.

If you're concerned about other forms of scanning compelled by the Government, you never consented to the search. So even if Apple complied, the search is invalid and cannot be used to prosecute you.


> If you're concerned about other forms of scanning, you didn't consent—so even if Apple complied, the search is invalid and cannot be used to prosecute you.

This is a dangerously false understanding of the law. Stop giving legal advice. You are not a lawyer.


Are you saying that if the US Government compelled Apple to scan millions of citizens' private property for non-CSAM images, this would not be a clear-cut violation of the Fourth Amendment?

I'm curious, do you think that the Third Party Doctrine applies here?


I think the realistic danger here is that the US Government no longer needs to compel this type of activity. Reference Twitter and Facebook/Instagram voluntarily censoring at the mere suggestion of the current administration/party in power.


Let's be practical for a minute. What specific image would Apple voluntarily search for on behalf of the US Government? I sincerely can't think of anything.


Images, leaked government files, anti-administration phrases, unflattering memes of the president, statements that contradict the government's current stance, etc.

All things current social media companies seem willing to censor at the suggestion of the administration.


You seriously think Apple would voluntarily search private devices for images which aren't illegal and don't even hint at any action which is illegal?

I don't think you're being serious.


Why not? Facebook and Twitter have done exactly that in the past year. Why is it far-fetched for Apple, given this amazing reversal of course on branding?

The only realistic alternative to Apple is Android... and Google is pretty darn transparent about spying on its users. Apple just did a 180-degree about-face on all the branding they've built over the last decade. Why should anyone trust Apple again?

Look, this whole neural-hash thing took, what, two weeks for people to fabricate collisions? That alone illustrates how poorly conceived and ill-thought-out the entire plan was. It's not beyond reason to assume any of these things given the evidence we currently have. (A toy sketch of why such collisions are craftable is below.)
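
To make concrete why it went so quickly: perceptual hashes are built so that visually similar images hash alike, and that same smoothness is what gives an attacker something to steer against. Here's a toy Python sketch using a classic 8x8 average-hash rather than Apple's NeuralHash; the names and the attack loop are my own illustrative assumptions (the actual NeuralHash collisions were produced by optimizing against the extracted model itself), but the principle is the same:

    import numpy as np

    HASH = 8  # 8x8 = 64-bit hash

    def ahash(img: np.ndarray) -> np.ndarray:
        # Classic average-hash: block-average down to 8x8, threshold at the mean.
        k = img.shape[0] // HASH
        small = img.reshape(HASH, k, HASH, k).mean(axis=(1, 3))
        return (small > small.mean()).flatten()

    def force_collision(img: np.ndarray, target: np.ndarray, step: float = 2.0) -> np.ndarray:
        # Nudge each block's brightness toward flipping its bit until the hash
        # matches `target`. The nudges are small, so the forged image still
        # looks roughly like the original.
        img = img.astype(float).copy()
        k = img.shape[0] // HASH
        for _ in range(500):
            bits = ahash(img)
            if np.array_equal(bits, target):
                break
            for idx in np.flatnonzero(bits != target):
                i, j = divmod(idx, HASH)
                img[i*k:(i+1)*k, j*k:(j+1)*k] += step if target[idx] else -step
            np.clip(img, 0, 255, out=img)
        return img

    rng = np.random.default_rng(0)
    innocuous = rng.uniform(0, 255, (64, 64))      # stand-in for a harmless photo
    target = ahash(rng.uniform(0, 255, (64, 64)))  # hash of a "flagged" image
    forged = force_collision(innocuous, target)
    print("bits matching target:", int((ahash(forged) == target).sum()), "/ 64")

The point isn't this particular toy; it's that any hash designed to tolerate benign edits necessarily gives an attacker a direction to push in.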


It does not appear one can opt certain folders out of this scan. If you enable iCloud backups, it scans the entire shebang.

As previously mentioned, Android doesn't scan all photos on your device... Google scans content uploaded to their servers. Which is reasonable... It's their servers, they can host what they want. Your iPhone is your iPhone.


> If you enable iCloud backups, it scans the entire shebang.

Citation?


Do I need one? Where in iOS can you choose which folders to opt into CSAM scanning? I only see an all-or-nothing option for iCloud Photos.


Yes, you do need a citation, because I've not heard Apple (or anyone else) claim that iCloud Backups or iCloud Drive are subject to CSAM scans.

From everything I've read, from Apple and other sources, if the photo is about to be uploaded to iCloud Photo Library then it is scanned for CSAM. If it's not, it isn't.


How does one choose individual photos to not upload?


You store the photos you want to keep private in another app. I'm sure there are lots in the App Store.

Still waiting on that citation.


If the default behavior doesn't exclude the photo roll from this new feature, I'm not sure what the argument is. Telling iOS users they should download some other app to keep photos private is absurd.


If a photo is about to be uploaded to iCloud Photo Library then it is scanned for CSAM. If it's not, it isn't.

Still waiting on that citation.


Are we arguing the same thing? How does one opt-out a specific photo? It's not possible as far as I know.


I've no idea what your point is. I've tried offering answers for all these random questions, but I'm still waiting for you to offer a citation for the claim you made earlier.


> I agree CSAM isn't likely to be pervasive in the photo libraries on iOS devices.

Where does this assumption come from? Because of iOS's lower market share? Are you implying they are more prevalent on Android devices? On desktop computers? I don't understand the logic.


The assumption comes from a general observation that most normal people don't tend to use their photo library to store legal porn (other than home made) and I haven't seen any argument for why CSAM aficionados are expected to be any less careful. There are surely plenty of apps out there for keeping separately encrypted vaults of files/photos, and I'm sure many are very easy to use.

You don't have to be particularly tech savvy to know it's a bad idea to co-mingle your deepest darkest secrets alongside photos of your mum and last night's dinner. Especially when discovering those secrets would lead to estrangement, or prison.

As for the few who might be doing it currently, that's likely to plummet quickly. If you think Apple's move caused waves in the Hacker News crowd, just imagine how much it has blown up in the CSAM community right now. I dare say it's probably all they've been talking about for the past two weeks.


Yeah, they’re probably all saying “well I know for sure I’ll never use an Apple device for my CP from now on!” From Apple’s point of view, that’s mission accomplished.


It is more believable they are introducing this tech for larger international security reasons still kept under wraps.


Possibly to avoid being told how they have to do it later.

They will have to do it either way, and the fact that they are even telling us how they plan to do it is more than we can say for other cloud services.

This is better than all the alternatives at this point, like it or not. If you don't like it, you might need to get up to speed on what the other services you're already using are doing.


Government coercion.


They had to have been coerced.



