
The problem I see with differential privacy is this: one part of the public doesn't care about privacy enough to demand such guarantees, and the other part wouldn't trust the math and the implementation behind it.

I mean, I consider myself moderately knowledgeable about statistics, but even I have trouble understanding DP. Worse, the scientists who are supposed to use it will also have a harder time understanding DP than their usual methods.



I mean, it really has nothing to do with the math. It's the fact that you have to send the real data to an untrusted third party and rely on their word that they will anonymize it.

And if a situation arises where a manager at Google has to decide whether to 'slightly' reduce the effectiveness of differential privacy because they need a certain metric for a report, do you really think they're going to make the principled choice?
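To make that concern concrete: in the standard Laplace mechanism, a single parameter epsilon controls the privacy/accuracy trade-off, so "slightly reducing effectiveness" can be as small a change as raising one number nobody outside the team ever sees. A minimal sketch (function names are illustrative, not any particular library's API):

```python
import math
import random

def laplace_noise(scale, rng=random):
    # Sample from Laplace(0, scale) via the inverse-CDF method.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count, epsilon, sensitivity=1.0, rng=random):
    # Laplace mechanism: add noise with scale = sensitivity / epsilon.
    # A larger epsilon means less noise -- and weaker privacy.
    return true_count + laplace_noise(sensitivity / epsilon, rng)

# Quietly raising epsilon makes the reported metric far more accurate:
rng = random.Random(0)
strict = [private_count(1000, epsilon=0.1, rng=rng) for _ in range(1000)]
loose = [private_count(1000, epsilon=10.0, rng=rng) for _ in range(1000)]
avg_err_strict = sum(abs(x - 1000) for x in strict) / len(strict)
avg_err_loose = sum(abs(x - 1000) for x in loose) / len(loose)
```

With epsilon = 0.1 the reported count is off by around 10 on average; with epsilon = 10 it's off by around 0.1, i.e. nearly the true value. The point of the comment stands: that dial sits entirely on the curator's side.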


> the other part wouldn't trust the math and the implementation behind it.

I trust the math and the method just fine, but I'm in the "won't trust the implementation" group. Ad companies like Google have demonstrated they can't be trusted too many times for me to believe they'll implement DP in a way that goes against their business interest.


The same can be said for end-to-end encryption...



