It's interesting that it seems the trial hinged on the use of a password specifically. I'd guess the thinking was: password protecting it implies you shouldn't access it without authorisation, he worked around this authorisation, therefore it was illegal access.
This strikes me as a naive, if understandable, viewpoint. The average person on the street surely gives special consideration to passwords as a concept, but from a security perspective they're just entropy. Engineers frequently use things like a "sufficiently random string" in a URL as a pseudo-password, or username-only HTTP basic auth, or API keys that aren't "passwords" but "keys" – all of these are the same concept: unpredictability.
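To make that equivalence concrete (a minimal sketch; the numbers and the `entropy_bits` helper are my own illustration, not anything from the case): a random URL token and a password can be compared purely by how many bits of entropy they carry.

```python
import math
import secrets

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy in a uniformly random string of the given length."""
    return length * math.log2(alphabet_size)

# A "secret" capability URL, e.g. https://example.com/report/<token>:
# token_urlsafe(16) encodes 16 random bytes, i.e. 128 bits of entropy.
url_token = secrets.token_urlsafe(16)

# A typical 8-character password drawn from ~94 printable ASCII characters:
password_bits = entropy_bits(94, 8)

print(f"8-char password: {password_bits:.1f} bits")  # ~52.4 bits
print(f"URL token:       {16 * 8} bits")             # 128 bits
```

By this measure the "mere" URL token is the stronger secret, which is the point: the label – "password" vs "key" vs "unguessable link" – says nothing about the actual height of the barrier.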
This is obviously a sad outcome for the researcher, and for the German cybersecurity industry, but I'm also surprised that the court was happy with such a shallow interpretation of security, as it theoretically leaves the door open to kinds of misuse that don't happen to involve a password, and may criminalise legitimate uses that happen by chance to involve one. The boundary seems to have been drawn in a place that won't be useful for anyone.
This is an old, old interpretation, and it is correct. You can't base law on a sliding scale like entropy. The value of the password is symbolic, not that it's easy or difficult - anything beyond a password is clearly private.
I don't love metaphors very much, but doing anything other than that opens the door to absurd defenses like "your honor, he left his wallet on the table when he went to the bathroom, it's obvious anybody can legally pick it up". Not to mention a cop favorite which is unfortunately in use: if you dare report a robbery, they'll fine you for not having your security system up to date. Yes, it's a thing. Does wonders for their closure rates.
It's not the height of the fence that makes it theft - it's that there is a fence at all.
I see your point, but I think maybe I didn't make my point very well.
To follow your fence analogy, they've ruled that any fence is important, but ignored brick walls and rope tied between posts – i.e. they've picked something overly specific, but poorly fit to the behaviour they are actually trying to classify.
While I'm sure that ruling against fences doesn't rule out a future ruling against walls or ropes, it probably complicates getting those rulings in the future, as one could mount a defence of "if they wanted to protect it they should have built a fence" – i.e. argue that the absence of the one recognised barrier implied no protection was intended.
An edge case would be a hidden URL. Publicly accessible, but not publicly available. That's more or less the same as a rope tied between posts, with the posts maybe felled by the wind. It's a good argument that it's a misunderstanding whether it's private property or not.
A password is not that.
(and to be frank, to anybody in the know even a passwordless phpmyadmin is clearly private)
I assume that this situation is why government systems will often have a notice that says something like “absence of access controls is not authorization for access.”
The first link is actually about extending privacy outside the house. So it's very much not making fences fuzzy.
The second link is basically about the wavelength you use when looking at the outside of the house. Not surprisingly, it doesn't matter. This is common sense, not a revelation.
A challenge to my comment would be if a house had a picket fence, somebody came by and took a hoe from the yard, and a judge said "nah, it's not theft, the fence was too low, all he had to do was reach out and take it". That would be a proper counter-argument.
You essentially state that my information is not a "proper counter-argument", while in the same response arguing that I am wrong because "this is common sense", rather than giving any real response.
The court recognized there was a fence for privacy, but that it was only applicable in certain situations, for example depending on the purpose of the recording. So it would be fine to record someone who was in the background of your picture, but not if it was for sexual purposes. Hence the numerous criteria to be considered.
So it's very much about making fences fuzzy, since they only apply to certain types of filming, the purpose, personal attributes, etc.
Thermal imaging and its effects on the Fourth Amendment are not common sense. Thermal imaging was initially ruled to be fine, then appealed and ruled fine again, then finally ruled not to be fine, but only in a 5–4 decision. The dissenting opinion included: “Heat waves, like aromas that are generated in a kitchen, or in a laboratory or opium den, enter the public domain if and when they leave a building.”
All you would have to do is read the Wikipedia article I gave to see that “Scalia's phrases "sense-enhancing technology" and "device that is not in general public use" in the Kyllo ruling have become influential in later rulings on police search procedures, but in an inconsistent fashion.[22] Several scholars and legal analysts noted the ambiguity in Scalia's use of those phrases.”
To use your example, a police officer could “reach over” and see the illegal activity with his own eyes through a hole in your fence, and that is legal. However, they could not use binoculars to see over your fence from a public area.
I'm not at my best to argue this, since it's evening here and I had a couple of glasses of wine with dinner. But my first instinct is to comment on how the case you're describing is at the edge of the law, where things are naturally fuzzy, as most borders are. So we're talking about an edge case of a metaphor - quite a bit removed from the main topic.
I stand by my previous example - a proper counter-example would be somebody reaching over the fence and stealing something. My main argument is that any password signifies a fence - be it a tall wall or a picket fence. Going beyond it is clearly not ok. Law and custom carve exceptions for white hats looking for security flaws - but both law and custom also specify how those white hats are supposed to behave. And a 3 day ultimatum followed by public disclosure is not it.
If you hire someone to store your valuables, walk over to visit them, and see a closed door, I don't think it is unreasonable to try the door knob to check whether it's locked, or even give the door a good push to see if it feels solid. And if the door swings open, it's further not unreasonable to poke your head in to see whether your valuables are secured inside.
1. He was not hired by them, but by a third party. Otherwise it would have been breach of contract, not unauthorized access.
2. There's a big difference between the door of a room once you're inside the house, and the front door. No, it's not ok to just open front doors to take a look around.
3. The valuables in this case are information. "Taking a look" is all you can do.
No no, he wasn't hired by them, they were hired by him (or, rather, his org).
His org had keys to the front door. His team had reported that their designated closet seemed to have a bunch of other people's stuff in it. He walked in, and instead of walls, the closet had curtains. He stuck his head through the curtain and saw other people's shit, then pulled his head out and reported it. Then he got fined.
In any case, password or not is not the problem. EU law actually has protections for white hat hackers, even when they actually hack. I do have a friend who does this regularly, he's been in the press, he's been sued, he knows the drill. There are a number of steps to follow, and a 3-day wait period is patently absurd. A proper process looks absolutely nothing like this. Amongst other things, my friend is actually contacting the authorities before even considering public disclosure.
The point of a white hat hacker is to help, not to make things worse.
I don't particularly disagree with the verdict. He was not hired by them, he did disclose the issue publicly, and the 3-day fix schedule is hilarious. And the 3000 eur fine is more like a slap on the wrist. I actually know an ethical hacker, and the process is quite different - the "deadline" is more like 3 months, and he always contacts the authorities a long time before anything has a chance of going public.
As for the company denying the issue, this means nothing. It's a reflex due to liability – GDPR exposes them to fines of millions, and an email saying "oops, we fucked up" is a quick shortcut to that.
> [...] police arrived at the researcher’s residence on September 15, 2021, “gained access to the apartment and pushed him against the wall. The police confiscated a PC, five laptops, a cell phone and five external storage media - the programmer's entire work device.”
This is the scary part. Total value confiscated is over 3000 eur, and the disruption created is even more than that. And this happened _before_ any conviction. THIS is what we should be up in arms about.
From what I understand, confiscating phones and keeping them for the duration of the investigation is becoming, if not standard, at least moderately common. This is punishment, not investigation.
> As for the company denying the issue this means nothing.
It doesn't mean nothing - if the company themselves claim it's not an issue, then they're admitting he didn't hack anything and didn't expose them to any risk by revealing the vulnerability. Because according to the company themselves, it "means nothing", so he just took them at their word. They can't have it both ways, claim it was nothing to avoid liability themselves, while also claiming it caused them great harm and he should be prosecuted.
I will, perhaps, have a tiny bit of sympathy for the company, when attacks by companies on consumers via undisclosed spyware (they call it "telemetry") are treated in the same "raid the premises and confiscate everything" manner by the authorities.
I'm not sure if you're making an argument or just ranting. But in case it's an argument: the best behavior for the company is to say "oops, we fucked up", work with the hacker to fix the issue, and maybe send some cash his way. The second best behavior is to deny the issue and work on fixing it. We didn't get a chance to see that, because of the 3-day deadline. It's even possible the initial denial came from a non-technical person, and it took more than 3 days just to escalate to somebody able to do something about it. We're not talking about just a guy with a laptop here; fixing this would most likely involve multiple people, both technical and non-technical, quite possibly somebody approving a budget, and so on. That's why I said a 3-day deadline is hilarious.
I had $10k of gear confiscated. Took them eight months to find nothing, then I had to go pick it up. This was in Australia.
Police forces are grossly under educated about anything related to computers and the internet, and ironically the people that suffer at their hands are the actual knowledgeable ones.
Are they? I think it's more of a policy issue. When I said phones are starting to be confiscated by default I didn't even mean tech crimes. Many incidents nowadays can be better investigated if you have access to messenger apps and social media - so it's quite tempting to just seize the phones and study them leisurely. If we don't want this as a default for any investigation, we probably need to make some noise about it.
Anecdotal aside: all my computer shit was taken, but they didn't confiscate my phone. I'm thankful that was the case in terms of the ability to conduct day-to-day life, but it's also a bit confusing. I wasn't allowed to touch the phone (or any of my other tech gadgets) whilst under questioning during the raid, and I'm sure they spent some time either imaging it or looking through its contents, but they didn't take it with them – nor any of my old phones.
It's confusing, because many of the 'cases' I read about in the news mention that the kind of content they're looking for is discovered on phones.
Maybe they figured the terabytes of data storage they did take (my NAS) would surely yield enough results.
I can explain, a bit, as I recently did a bit of research in this area, particularly Section 202 (Section 202a, Section 202b, and Section 202c) of the German Criminal Code (Strafgesetzbuch, StGB) which addresses the unauthorized access to data. Disclaimer: I'm not a lawyer (but I do have one).
In Germany this is taken very seriously.
- 202a makes it illegal to access data within a system without the necessary authorization, where that data was specially secured against access. I think this also means (I'm sure someone out there can correct me if I'm wrong) that even if the accessor knows the password and has even been given it, but does not have the explicit granted permission nor authority to use it to access the system, it is a crime. I'm pretty sure (as a layman) that this is what this case is about. 'Goodwill, whitehat' etc. will not help here. Lucky he had no criminal priors and Germany is not generally a fan of throwing people in prison.
- 202b deals with interception of data which was not intended for the interceptor
- 202c is the "hacker paragraph": it covers preparing an offence under (a) or (b), e.g. by producing, obtaining or distributing passwords or tools for that purpose.
Is there not wider EU regulation dealing with responsible disclosure? I know for a fact that in Romania you can do that, if and only if you follow certain steps (definitely not public disclosure after 3 days, obviously), and I _think_ the source is EU regulation, not local law. IANAL etc.
> The court convicted the researcher, calling into question whether accessing software with weak password protection through readily available methods constitutes hacking
> Ultimately the court sided with the prosecution, finding the researcher guilty of hacking
So which one is it? I haven’t read the actual court documents, but this is confusing.
While this is a disastrous ruling, it also will very likely not stand in the higher court that will handle the appeal. But great that our idiotic hacker tool law from 2007 is finally getting some international exposure.
From what I heard the (idiotic) law was interpreted correctly.
There might be a final chance with the European Court of Human Rights if everything else fails.
But that easily takes many years. The right thing is to show your government how stupid the law is (harmful for Germany's security), to change it quickly and retroactively, and to compensate the researcher (I'd also say punish the company seriously, but that can't be applied retroactively).
when you find a hole, you go sell it on the darknet.
governments, the justice system – they will F you up if you try to play white knight.
seriously, that security researcher just got what he deserved.
you have to sell that data on the darknet. and F them up again, and again, and again until their asses bleed too much and they give FULL protection to security researchers.