
"Snowden: Dropbox is hostile to privacy, unlike 'zero knowledge' Spideroak" - http://www.theguardian.com/technology/2014/jul/17/edward-sno...

and from https://firstlook.org/theintercept/document/2014/07/14/jtrig... under Honeypots we have LONGSHOT: "file-sharing and upload website."

And Condoleezza Rice just got on their board. I haven't seen anything that makes an official link between those but at least I moved off Dropbox :-P



It scared the crap out of me when Dropbox asked if I wanted to save my screenshot. Since when did it have access to things like that? I had a phone interview with Dropbox a few weeks ago and they mentioned a ton of new products that seemed vaguely off-putting as well. After the screenshot prompt I immediately uninstalled Dropbox, but after a few days I realized it had a bunch of useful backups, so I ended up reinstalling it.


If the dropbox program is running under your user account, it has the same permissions you do. Unless you're on Linux and have restricted it via AppArmor.
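To give a flavor of what an AppArmor confinement could look like (this is my own illustrative sketch, not a tested profile; the binary path, variable includes, and rule set are assumptions you'd adapt to your install):

```
# Sketch of an AppArmor profile confining the Dropbox daemon.
# Paths are illustrative assumptions, not a known-good profile.
#include <tunables/global>

/home/*/.dropbox-dist/dropboxd {
  #include <abstractions/base>
  #include <abstractions/nameservice>

  # In enforce mode, anything not listed below is denied by default,
  # so Dropbox only sees its sync folder and its own state directory.
  owner @{HOME}/Dropbox/ rw,
  owner @{HOME}/Dropbox/** rwk,
  owner @{HOME}/.dropbox/ rw,
  owner @{HOME}/.dropbox/** rwk,

  # Network access for syncing.
  network inet stream,
  network inet6 stream,
}
```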


Even with AppArmor in place, it still has access to read every key you press in any other X11 app. Here's how I lock down Dropbox on my system:

https://grepular.com/Protecting_Your_GNU_Linux_System_from_D...


Isn't that behavior -- asking to do something before it does it -- the behavior that you'd like? I would be upset if it went ahead and did that, and then I discovered the feature later.


Maybe, if I wanted to save my screenshots. I didn't want that behavior at all, and it surprised me that it had access to things outside of the well-defined folder I already knew about.


How can we trust SpiderOak when their client isn't open source?


I think the more important question is: even if their client is open sourced, how can we trust that the binaries are built from exactly those sources? We'd also need some way of testing it without the sources; maybe sniffing packets to check that the content is actually encrypted is a start.
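One cheap first look along those lines (my sketch, not an established test, and it proves little on its own since a back door could also emit high-entropy data): well-encrypted traffic should look like uniformly random bytes, while plaintext has much lower byte entropy.

```python
# Crude black-box check: compare byte entropy of captured payloads.
# Encrypted/random data scores near 8 bits per byte; plaintext far lower.
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte of a byte string."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

plaintext = b"the same words over and over " * 100
random_like = os.urandom(65536)  # stands in for a captured ciphertext payload

print(f"plaintext entropy:  {shannon_entropy(plaintext):.2f} bits/byte")
print(f"ciphertext entropy: {shannon_entropy(random_like):.2f} bits/byte")
```

In practice you'd run this over payloads pulled from a packet capture rather than canned strings; low-entropy payloads would be a red flag, but high entropy only rules out the obvious failure mode.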


The whole point of being open source is that you can verify the sources, compile them yourself, and use the product of your own compilation with SpiderOak's server side.

More realistically, your distribution maintainers will verify and compile the package, and since you trust your distribution maintainers more than SpiderOak, you delegate source verification to them.


> The whole point of being open source is that you can verify the sources, compile them yourself, and use the product of your own compilation with SpiderOak's server side.

So this is a good point, however:

> More realistically, your distribution maintainers will verify and compile the package, and since you trust your distribution maintainers more than SpiderOak, you delegate source verification to them.

This is actually pretty compelling, but it ignores Windows and OS X, which not only account for the vast majority of users, but also for the users least likely to compile their own software. They will trust the provided binaries instead.

This is one of those "Why Johnny Can't Encrypt" (http://www.gaudior.net/alma/johnny.pdf) situations. I'm not singling out SpiderOak here; I think this is a more general open problem with any third-party service that's meant to be trusted. It's a contradiction that public-key cryptography has only partially resolved.


Windows and OS X devices already trust a huge number of binaries compiled by companies that we know have been compromised.

Or, in other words, nobody who cares about privacy is using those anyway (at least while they care about privacy), by definition.



Checking for encrypted packets wouldn't tell you anything about the binaries' trustworthiness. A back door might just encrypt the data with a second key, or more specifically, encrypt the key that encrypts the data with another key. A reliable way of testing binaries doesn't seem very feasible to me. It's like antivirus vendors trying to find new viruses: the malware authors can always obfuscate their code just a little more, do it just a little bit differently, and now it does the same thing while escaping detection.

Authors of open source software who want to distribute trustable binaries should include instructions for how to reproduce the binary exactly from the source. A third party verifier could reproduce the binary, then publish a digital signature affirming that they reproduced it, allowing anyone who doesn't want to compile it to check with a trusted third party.
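The check a user would run in that scheme is just a digest comparison. A minimal sketch, with made-up byte strings standing in for real build artifacts (a real verifier would hash the actual reproduced binary and sign the digest, e.g. with GPG):

```python
# Hypothetical verify-by-hash flow for reproducible builds.
import hashlib

def sha256_hex(data: bytes) -> str:
    """SHA-256 digest of a byte string, as lowercase hex."""
    return hashlib.sha256(data).hexdigest()

# The third-party verifier reproduces the build from source and
# publishes (a signed copy of) its digest.
reproduced_build = b"deterministic build output"
published_digest = sha256_hex(reproduced_build)

# A user checks the vendor-provided binary against the published digest
# instead of compiling anything themselves.
downloaded_binary = b"deterministic build output"
matches = sha256_hex(downloaded_binary) == published_digest
print("matches published digest:", matches)
```

The hard part isn't this comparison, of course; it's getting the build to be bit-for-bit reproducible in the first place, and getting users to check the verifier's signature on the digest.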

But all of that is a moot point if the source code isn't being very carefully checked.



