Hacker News

Unfortunately, verifiable privacy is not physically possible on today's MacBooks. Don't let a nice presentation fool you.

Apple Silicon has a Secure Enclave, but not a public SGX/TDX/SEV-style enclave for arbitrary code, so these claims are about OS hardening, not verifiable confidential execution.

It would be nice if it were possible. There are a lot of cool innovations possible beyond privacy.



I wrote a whole SDK for SGX; it's cool tech. But in theory, on Apple platforms you can get a long way without it. iOS already offers this capability and it works OK.

macOS has a strong enough security architecture that something like Darkbloom would have at least some credibility if there were a way to remotely attest a Mac's boot sequence and TCC configuration, combined with key-to-DR binding. The OS sandbox can keep apps properly separated as long as the kernel is correct and uncompromised, and Apple's systems are full of mitigations and roadblocks to simple exploitation. Would it be as good as a consumer SGX enclave? Not architecturally, but the usability is higher.
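To make the remote-attestation idea above concrete, here's a minimal sketch of the flow: the device folds hashes of its boot chain into a single measurement, signs it along with a verifier-supplied nonce, and the verifier checks both the signature and the measurement against a known-good value. Everything here is hypothetical and simplified; real schemes (SGX DCAP quotes, TPM PCR quotes) use asymmetric keys rooted in the vendor, whereas this sketch uses an HMAC with a shared key just to stay self-contained.

```python
# Hypothetical attestation sketch. Names (DEVICE_KEY, make_quote, etc.)
# are illustrative, not any real API. HMAC stands in for the asymmetric
# signature a real hardware root of trust would produce.
import hashlib
import hmac

DEVICE_KEY = b"shared-secret-stand-in"  # real: hardware-fused private key

def measure(components):
    """Fold component hashes into one measurement, like a PCR extend."""
    state = b"\x00" * 32
    for blob in components:
        state = hashlib.sha256(state + hashlib.sha256(blob).digest()).digest()
    return state

def make_quote(components, nonce):
    """Device side: sign measurement + nonce so replies can't be replayed."""
    m = measure(components)
    sig = hmac.new(DEVICE_KEY, m + nonce, hashlib.sha256).digest()
    return m, sig

def verify_quote(m, sig, nonce, expected_measurement):
    """Verifier side: check signature freshness and known-good measurement."""
    sig_ok = hmac.compare_digest(
        sig, hmac.new(DEVICE_KEY, m + nonce, hashlib.sha256).digest())
    return sig_ok and hmac.compare_digest(m, expected_measurement)

boot_chain = [b"bootloader-v2", b"kernel-23.4", b"tcc-policy-db"]
golden = measure(boot_chain)               # value the verifier trusts
nonce = b"verifier-challenge-123"
m, sig = make_quote(boot_chain, nonce)
print(verify_quote(m, sig, nonce, golden))                        # True
print(verify_quote(measure([b"tampered"]), sig, nonce, golden))   # False
```

The point of the nonce is freshness (an old quote can't be replayed), and the point of the golden measurement is that the verifier, not the device, decides which boot configurations count as trustworthy.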


As if you get privacy with the inference providers available today? I have more trust in a randomly selected machine on a decentralized network not being compromised than in a centralized provider like OpenAI pinky promising not to read your chats.


Inference providers don't claim private inference. However, they must meet certain security and legal compliance requirements.

You have no guarantees over any random laptop connected across the world.


I would say the chances of OpenAI itself getting hacked and your secrets leaking via logs are about the same as, or lower than, the chances of a randomly selected machine on a decentralized network being reverse-engineered by a determined attacker. There's no risk-free option; every provider comes with risks. If you care about infosec, you have to rotate secrets frequently anyway.


Every hardware key will be broken if there is enough incentive to do so. Their claims read like pure hubris.


Who cares about AI privacy? Most people don’t. If you do, run locally.



