Hacker News

This is weird. Apple's own announcement only talks about hash matching, but other reporting (e.g., [0]) talks about a system called 'neuralMatch' that's doing AI on user photos. To me, the privacy implications (and the chance of false positives) seem quite different. Quite a discrepancy.

[0] https://www.zerohedge.com/technology/apple-plans-monitor-all...



No, Apple's announcement talks about machine learning to power the iMessage nudity detection algorithm. Machine learning in the context of ruining someone's life (and/or landing them on a government watchlist forever) is a huge no in my book.


Ah, you're right, it does go into that later.

The reporting isn't helping, with sentences like 'the company is rolling out a new machine-learning tool that will scan iPhones for images that match certain "perceptual hashes"'. Is that describing a system that classifies new photos, or one that compares hashes against a known-bad set?
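To illustrate the distinction being asked about: a perceptual-hash matcher never tries to understand what a photo depicts; it only checks whether the photo is a near-duplicate of something already on a known list. Below is a minimal toy sketch using a simple "average hash" (downscale to an 8x8 grayscale grid, threshold against the mean). This is purely illustrative; Apple's actual NeuralHash system is far more elaborate, and none of these function names are Apple's.

```python
# Toy perceptual-hash matching, as opposed to an ML classifier.
# "Average hash": one bit per pixel of an 8x8 grayscale thumbnail,
# set if that pixel is brighter than the image's mean.
# Names and thresholds here are illustrative assumptions, not Apple's API.

def average_hash(pixels):
    """pixels: 8x8 list of lists of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known_set(h, known_hashes, max_distance=4):
    # Flags only near-duplicates of *known* images; nothing here
    # interprets the photo's content the way a classifier would.
    return any(hamming(h, k) <= max_distance for k in known_hashes)
```

A slightly re-encoded or resized copy of a listed image lands within a small Hamming distance of its stored hash and matches, while an unrelated photo (even one with similar subject matter) does not. A classifier would behave in the opposite way: it generalizes to new images and can flag photos nobody has seen before, which is where the very different false-positive and privacy trade-offs come from.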



