Hacker News

If you advertise a browser as being all about privacy, you absolutely should not be a/b testing users without their explicit consent. It is amazing to me that this needs to be said.


Okay, why? What does a/b testing have to do with privacy?

If I told you right now that HN is A/B testing the size of the vote arrows, and you see larger ones than I do, how exactly are your privacy rights violated?


A/B testing implies extra tracking and reporting of behavior, to distinguish the A group's response to the change from the B group's.

On a website, that isn't too much of a problem, because you expect your activity to be tracked since you are interacting with a web server.

In a browser, the same tracking is more intrusive. My behavior on a site is now being sent to the browser author, an unrelated party.
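To make the mechanics concrete, here is a minimal sketch of how client-side A/B bucketing and reporting typically work. Everything in it is hypothetical (the experiment name, the per-install ID, the payload fields); it is not Mozilla's actual telemetry, just an illustration of why comparing arm A to arm B requires sending something back:

```python
import hashlib

def ab_bucket(client_id: str, experiment: str, buckets=("A", "B")) -> str:
    """Deterministically assign a client to an experiment arm by hashing
    a stable per-install ID together with the experiment name."""
    digest = hashlib.sha256(f"{experiment}:{client_id}".encode()).hexdigest()
    return buckets[int(digest, 16) % len(buckets)]

# To compare arms, the vendor needs at least (bucket, metric) pairs per
# client. This reporting step is where the tracking lives, regardless of
# how innocuous the metric itself is.
report = {
    "experiment": "back_button_size",     # hypothetical study name
    "bucket": ab_bucket("install-1234", "back_button_size"),
    "misclicks": 3,                       # hypothetical behavioral metric
}
```

The bucketing itself can happen entirely on-device; the privacy question is what ends up in `report` and where it is sent.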


> My behavior on a site is now being sent to the browser author, an unrelated party.

But you're just assuming it is. It could also be "On uninstall and crash, report back which studies were enabled" to track how many people are uninstalling their browser when the flag stupid_feature="on" is set.

Be mad about the data collected [specifically, what type of data is collected], not about the concept of a/b testing. Set clear boundaries of what is and isn't acceptable. Otherwise you're fighting an ever steeper uphill battle. Because if they want to know which sites you visit and the privacy boundary is "a/b testing isn't acceptable", they'll do it via history syncing instead.


> But you're just assuming it is

Of course we are. We didn't consent to being experimented upon or to have our behavior analyzed, so we have no basis for trust.

> Set clear boundaries of what is and isn't acceptable

Acceptable depends on trust. Ask for permission, not forgiveness, and I am more inclined to trust. Without that, I assume the worst.


You're not assuming the worst if your assumption is that "some third-party I distrust will respect the checkbox they themselves control".

Mozilla has arbitrary code execution rights on your computer the moment you allow them auto-updates. They're also entirely free to ignore your opt-out or opt-in requests.


That's not an analogous situation. You can inspect and control what the code does on your device, but you can't inspect what is done with your data after it leaves it.


Yeah... no. If your claim is that the code on your own device can just be inspected, then you can inspect what data leaves your device instead of assuming Mozilla hoovers up everything, including your porn stash, credit cards, and the weed hidden in your CD tray.


Yeah... yeah. What data goes out and what is done with it after it leaves are absolutely not the same thing.


> Be mad about the data collected, not about the concept of a/b testing.

What does this sentence mean? A/B testing fundamentally requires data collection, so I don't understand how to be mad about the data collected but not A/B testing.


I think they're saying to be mad about what data is collected. For example, be mad if they're storing your IP address and location and not be mad if they're just storing a counter of how many people visited their marketing page.


A privacy-first browser should be completely up front about what data is being collected. The simple fact that they collected it while implying otherwise means that they aren't to be trusted not to abuse the privilege in the future.


That makes sense.

Don’t be mad _that_ data is collected, be mad about _what specific_ data is collected.

I don’t know that I agree, but it does help me understand the meaning of the previous post. Thanks!


> My behavior on a site is now being sent to the browser author, an unrelated party.

What is the consequence of this? In the abstract I understand that it doesn’t feel great, but I’m asking in the specific, how does it make a difference to any individual user?

If you knew that the user in the A group made fewer misclicks after the back button was shrunk … how does that harm anyone?


You could say the same thing about pretty much all adtech tracking, since nearly everyone would be unable to demonstrate any concrete harm to themselves.

That's not the point. The point is that if you collect data on anything but an opt-in basis, you are not pro-privacy. What you do with the data, why you collect it, or whether that collection is harmful is irrelevant to this principle.


That's just extra data points that you give away for no good reason. Realistically it will not amount to anything (like, I'm sure, the absolute majority of data collected even by the sleaziest adtech). But worst case scenario? Your data gets sold to a medical insurer, deanonymised, fed into ML models, then used to justify increasing premiums based on inferred health info.


All A/B testing really implies to me is, at most, crash reporting, which shouldn't be much of a privacy concern.


I've never seen A/B testing used for crash reporting. Perhaps it could be used for this, but it's flabbergasting to me that you see it as only for crash reporting.


You've never seen partial rollouts of a feature?


A browser A/B testing and sending analytics data back home is different from a website.

A _privacy marketed_ browser sending such analytics data back home is different from a regular browser doing so.


Obviously you'd need to know how they're measuring the effect. And I think that's the problem: Firefox doesn't make that easy enough. If, on installation, Firefox gave

* a short list of all the data they could collect, and

* a promise that they won't collect anything else without asking for the user's consent again,

then I doubt anyone would complain.


It needs to be said because some folks are blinded by their belief in the virtuousness of themselves or their in-group.

I may agree with the goals, but that doesn't mean I should blindly trust them to execute on those goals, right?



