> Especially because child porn isn't the initial but a follow-up crime. The initial one is the actual abuse, which isn't prevented.
>> You are assuming that all the abuse would still occur and none of it is encouraged by the demand for that content. I don't know why we should make that assumption.
There's near-universal agreement that child abuse is wrong and should be stopped. Profiting off of child abuse is even more wrong. But how much of the CSAM detected on phones is going to be there because the user engaged in commerce with it?
I think you'll have a few different cohorts caught by this process, but almost none of them will be producers/resellers/purchasers of CSAM content.
I'd bet most of the CSAM detected will come from the occasional legal case that stems from teenagers sending explicit content to each other. Once that content is registered in the database, any classmates who downloaded photos from group texts and backed them up to iCloud without thinking about it will be raided and prosecuted.
Another group will be people who had CSAM planted on them maliciously, because this creates an easy vector for swatting if you can get photos onto someone's device. Not to mention the adversarial collision attacks on the image hashing that have been on HN in the last few days.
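(To make the planting/collision risk concrete, here's a toy perceptual hash in Python. This is emphatically not Apple's NeuralHash, just a minimal "average hash" sketch showing the general shape of perceptual matching: images get reduced to fuzzy fingerprints, and fuzzy fingerprints can be forced to collide.)

    # Toy "average hash": reduce an 8x8 grayscale image to 64 bits,
    # one bit per pixel, set if that pixel is brighter than the mean.
    def average_hash(pixels):  # pixels: 8x8 grid of 0-255 ints
        flat = [p for row in pixels for p in row]
        mean = sum(flat) / len(flat)
        return int(''.join('1' if p > mean else '0' for p in flat), 2)

    def hamming(a, b):  # bit distance between two hashes
        return bin(a ^ b).count('1')

    # Two images with different pixel values can share a hash exactly,
    # while re-encodes/crops of the same image keep the hash stable.
    img_a = [[10] * 8 for _ in range(8)]; img_a[0][0] = 200
    img_b = [[10] * 8 for _ in range(8)]; img_b[0][0] = 255
    print(hamming(average_hash(img_a), average_hash(img_b)))  # 0 -> "match"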
Also, because we can't inspect the CSAM database, it's quite likely that some percentage of what people thought was adult porn was actually underage. I see nude photos shared on forums all the time. I'm sure some people save those photos to their devices, and it's unknowable whether the busty teen is just an 18-year-old who felt like sharing or revenge-porn content from a 17-year-old's CSAM case. Surely some porn addicts who have been saving every image they found attractive for years will get caught up in this.
Even catching the group ostensibly targeted by this technology is unlikely to prevent actual child abuse. For example, someone might join a porn Telegram channel, be exposed to CSAM with or without realizing what it is, and save it to their device. People caught in this group probably do need some kind of mental-health help, which the criminal justice system will not provide. Piracy may create some demand, but it's still pretty far removed from the abuse itself. Out of this group, an even smaller fraction may have actually paid for CSAM, and that's the first group where you'd find unanimous support for law enforcement action.
The people we all want to see locked up, those actually abusing children or those profiting from and funding the abuse of children, are unlikely to be caught in this type of dragnet.
I'd imagine that when Apple turns this feature on, it will immediately sweep up thousands of people, and almost none of them will be involved in the production or sale of CSAM.
> I'm not saying Apple is definitively involved in some shady stuff, but from my perspective, it does look like NSA forced them to do some sort of file scanning backdoor and they came up with this "it's about saving the children" explanation, already successfully in use in oppressive countries.
It does seem that way, because it's just so hard to draw a line from "Apple scans all your images to match previously known child abuse imagery" to "child abuse is prevented".
I think the absolute best outcome we can hope for here is that some people who purchased CSAM, essentially low-level users, will be discovered and prosecuted. Maybe that's a good thing, and perhaps it will remove thousands or even tens of thousands of dollars from the underground economy. But it hardly seems worth the cost to privacy and security if we never stop any actual child abuse.
The famous porn actress Traci Lords was 15 when she got started in the business. She has talked about what happened back then, and about the fact that well-known actors like John Holmes knew about it and didn't care. They just gave her more drugs to keep her quiet while they did their business.
So anyone with those videos, or explicit pictures of her taken before she officially turned 18, would be guilty of having kiddie porn on their computer or otherwise on their premises.
It’s gotten better since then, but the industry is still pretty seriously messed up.
Where the crime becomes even greater is when the "ignorance is no excuse" policy is used to harass and even convict innocent people for crimes, committed by other people, that they could not possibly have known about.