Will you cooperate if you receive a legal warrant from China, Russia, Iran or North Korea?
Also, Telegram is accused of other things, like importing uncertified cryptographic tools. If you go to GitHub, there are plenty of repositories of uncertified cryptographic tools. Why aren't the ssh or openssl developers arrested? I am sure they do not have a proper certificate.
My guess is that this is a typical case of law enforcement bending vaguely written laws. A law saying something like "it is illegal to assist in committing a crime" is a typical example: until there is a court decision you cannot even know whether some act was legal or illegal. And "assist" is itself a very vague term.
When Snowden was escaping capture from the West, err, I mean, the US, we, the US, err, the Obama admin, wanted to know which plane he was on to see if they could have it "diverted".
Also, don't pretend to "go hiking" along the Iranian border...
I don't know if I should be sad that the West is trending towards treating humans as China/Russia/Iran/NK do, or if I should be happy that geopolitical powers have one less thing they disagree about.
Probably not, but I would also expect to face consequences if I went to a country whose court order I chose not to fulfill.
It's the same thing with the USA in the crypto community, you either don't sell US citizens unregulated securities or you sell and avoid USA or any country that might extradite you to USA.
I remember the days when the internet was the wild west: laws wouldn't apply and the cops lacked the tools to catch anyone or enforce anything. Those days are long gone. The rule of thumb now is that countries have rules and tools for this stuff, and if you break them, better never visit those countries.
>It's the same thing with the USA in the crypto community, you either don't sell US citizens
When I read that at first, I thought crypto was short for cryptography, and you were referring to the crypto wars, back when most cryptography was classified as arms (weapons) in the US and it was illegal to distribute cryptographic software in many cases.
One of my instructors in college attended a US college, but during his PhD would drive to Canada to write all the code, so that the code could be legally distributed. If it was written in the US, it couldn't be.
Yes, the US does have export regulations on cryptography, and you can indeed get in trouble with the US for math. App developers who distribute through app stores go through the paperwork to stay compliant.
I propose that we as participants show at least a minimum level of sophistication and acknowledge that the law in France, a democratic country under the rule of law with an independent judiciary, is indeed somewhat special, or superior, compared to the law of a dictatorship like China.
If we cannot reach this (imho very low) bar of understanding, we have no chance of a fruitful conversation. If we continue pretending that this is an earnest stance, we normalize, justify and strengthen autocracies, which we don't need here.
One might be better off trying to further these kinds of viewpoints in China or any other dictatorship of one's choice.
Laws are nothing more than rules that are (usually) created and enforced by entities with the power to enforce them. There is no objective superiority or inferiority of any particular subset of laws.
Then there are no morals at all with that particular take on how laws, morals, and the rest are formed. Equality, personal sovereignty, etc mean nothing because they are just constructs of some human’s mind that somehow got mindshare; all meaningless under your guidelines.
That's the thing though. Morals are subjective, not objective. But also morals are orthogonal to laws.
Consider a law banning the eating of meat, vs one dictating that meat should be eaten at least once per week. One of these laws must be enacted. Which one is superior? Obviously (I think), the vegetarian would consider the former superior, and the meat lover the latter.
But we can also look further into environmental differences. Given a hypothetical area where animals are abundant and plants aren't, the latter law would generally be considered superior. In an area where this is reversed, the former would be. Ultimately these are subjective values based on the actual situation.
And there's always the power dynamic that "trumps" situations. A vegetarian gaining the power to make laws in a vegetatively poor area will obviously not - necessarily - lead to the latter law being enacted. Even if the majority of the population is made of meat lovers. And in a world dominated by vegetarians, the many will applaud and say the ban on meat is the "right thing" for those in said vegetatively poor area.
All that is just an elaborate illustration to say that yes, equality, personal sovereignty, etc really are - subjective - human constructs. And any laws surrounding them will be considered superior/inferior based on how these constructs are valued.
In France if you operate a telecommunication network, even encrypted, and your network is used to conduct illegal activities, you will end up with the police taking you into custody to ask you for details.
The result of this is that users, especially any users the government might be interested in, would use the services that don't keep any information. Which in general is good -- services shouldn't be keeping data for no reason -- but sometimes there's a reason. Maybe an ordinary user would like to have a cloud backup, but not if it could create legal liability (and then their lawyers order them not to do it).
So we should be asking if this makes sense. If the government can snatch the data from third party services without a warrant then the people interesting to them will use the ones that don't store it, and then the government can't get the data with only a subpoena anyway. They may not even be able to get it with a warrant. But it also means that people can't have any features that require a trusted third party if they're concerned about legal liability from e.g. prosecutors taking things out of context, which should be everybody because they do that all the time.
What you're saying in effect is that any random country should be able to subpoena records for anyone. To be logically consistent, what if say Iran wants the records of the head of the CIA? The concern is not limited to Telegram; it generalizes to any messaging app.
There is no logical inconsistency here. Durov, a citizen of France, failed to comply with his own country's process. This isn't a "random country" applying "for anyone".
One will find that countries don't really distinguish when it's one of their citizens taking action and those citizens are within the country's jurisdiction.
Governments are funny like that. If I go to Germany and kill a man in the street, I might not end up under arrest if I go to, say, Russia, but if I go back to my home country? Yeah, they'd arrest me for a crime committed elsewhere in someone else's jurisdiction we're allied with.
Edit: By "operates" I mean its services are accessible from France. If an entity cannot or will not comply with EU laws (e.g. GDPR) they can block clients from the EU. I regularly come across such websites with geoblocks (a few times a year).
You can access substantially all internet services from France, and from any other country that doesn't explicitly block them, and from most of the ones that do via satellite internet or VPNs etc.
> you can create an account from a French number.
You can have a French number and not be in France. Also, this is the location of the user, not the company. This is basically the same reasoning as the internet; the phone network is global and anybody can call anybody. That doesn't mean that every service provider is in every country.
> Also the website is in French and the illegal messages are likely in French.
There are millions of people who speak French who aren't in France.
> Now add on top, that the CEO is French on French soil...
But we're talking about where the company is operating. Should France arrest the French CEO of a US company that does something in the US which is legal in the US but not in France?
For internet websites, if a website can be accessed from France, then the French law applies.
It's under the same principle that Megaupload was shut down by the US, even though the company was not connected to the US or operating from the US.
This is why some websites block visitors from France (e.g. newspapers not willing to bother with GDPR) and some sites block visitors, registrations and payments from the US (e.g. investment websites).
Here Telegram refuses to respect French law. Whether they are ethically right or not is one debate, but in the meantime the investigators want to gather evidence.
It's the standard way the police operate there. They don't send lawyers. Instead they put you in custody for interrogation.
A bit rude, I recognize, though relatively common. Even the French CEO of Uber was taken into custody there.
> The law is quite simple (and it's not just in France), if a website can be accessed from France, then the French law applies.
The internet is global. Any website can by default be accessed from any country. Are you proposing that anyone with a website is subject to the laws of every country? Notice that this is impossible to do; the laws of different countries will require mutually exclusive things.
> It's under the same principle that Megaupload was shut down by the US, despite the company was not connected to the US or operating from the US.
You're citing an extremely controversial practice to justify the same. The question is what should be the case and is reasonable, not what some country has managed to get away with through some questionable shenanigans or might makes right.
> At least to show: "we tried our best".
This is just a fig leaf. IP location services are notoriously unreliable and trivially bypassed. Anybody in any location can choose the IP address they make a request from. Also, fragmenting the internet in this way is poison and should be discouraged as a matter of policy. You're effectively asking websites to block foreign countries by default because they don't have the resources to hire a lawyer for every country that exists to see how to comply with their laws.
The ship has long sailed. EU decided that our laws will be applied maximally - to all EU citizens regardless of their location and for everyone within our territorial control.
And if other countries like USA have conflicting laws like Patriot Act then companies can be even forced to divest from EU or split their businesses so that USA branch can not enforce USA spy laws in EU. Google Privacy Shield.
> The ship has long sailed. EU decided that our laws will be applied maximally - to all EU citizens regardless of their location and for everyone within our territorial control.
Ships that have sailed can still be sunk.
> And if other countries like USA have conflicting laws like Patriot Act then companies can be even forced to divest from EU or split their businesses so that USA branch can not enforce USA spy laws in EU.
How is this supposed to apply to a small business with one employee?
The internet is globally accessible and that used to be considered a good thing. If France or the EU want to build a Chinese-style firewall to block unapproved parts of it, that’s up to them.
If the head of the CIA set foot in Iran, you bet your butt that they'd be at serious risk of arrest. The only reason this wouldn't be the case is Iran not having nukes.
How does one country having "nukes" make something more or less legal? I don't see the connection here. A crime is a crime no matter the nationality and position of a person. If criticizing the Communist Party is a crime, it doesn't matter who commits it; everybody should be treated equally.
> Crime is a crime no matter what is the nationality and position of a person
Iran apprehending an American official crosses from criminality and law into the anarchy of geopolitics.
For example, the fact that we have nukes is probably one among many considerations a foreign power might weigh when thinking about kidnapping the President.
Government communications are typically exempt from subpoena by foreign courts. Telegram being available to the global audience would be in a different category. I doubt the head of the CIA uses it.
The EU has been doing a lot of really bad stuff regarding the freedom of private communication recently (the last few years), yet they still manage to complain about others.
It's amazing how many on HN are actually closet authoritarians. It was made extra clear during the pandemic and becomes clearer with every one of these posts
> Signal being unable to provide information to a subpoena is very different than arresting its CEO
If Signal had blown off the court completely, yes, that would have resulted in arrest warrants. That's at least what France alleges Durov did in the warrant.
On the flip side of the coin, it's amazing how many on HN are public anarchists. When we come together as a society to create laws, we make them for a reason, and nobody is above them.
I'll take a public anarchist over secret authoritarian any day. Also, your implication that anyone who questions authority or these charges somehow is an anarchist or believes some should be above the law is laughable
> seen far fewer anarchists running around subjugating people, giving people Syphilis to see what happens, feeding orphans radioactive substance laced milk for the same, running off to war, imposing drafts, playing at proxy wars
Real anarchy does all of these things. It's why living under fighting warlords is a reliable precursor to authoritarianism: it's better than anarchy in the short term.
Some of these warrants and subpoenas are senseless, though, and come from deliberate political prosecution. This seems to be the case in France; not a good look at all, especially with these accusations.
But Signal did get it right too; what they do is basically malicious compliance. Provoking a state into making a fuss is another thing.
Among one of the many charges... "Complicity - Detention of the image of a minor of a child-pornographic nature."
Him being charged is as simple and standard as a platform being shut down and its administrator charged for propagating illegal content. As standard as a Tor drug site being shut down and its owner charged. Worse, Durov was involved with CSAM material and did nothing about it. For this, he is immoral and in my opinion, a disgusting human being.
Complicity - Administration of an online platform to allow an illegal transaction by an organised gang,
- Refusal to communicate, at the request of the authorised authorities, the information or documents necessary for the realisation and exploitation of interceptions authorised by law,
- Complicity - Detention of the image of a minor of a child-pornographic nature,
- Complicity - Dissemination, offer or making available, by an organised gang, of images of a minor of a pornographic nature,
- Complicity - Acquisition, transport, holding, offer or disposal of narcotic products,
- Complicity - Offer, sale or making available, without legitimate reason, of equipment, an instrument, a program or data designed or adapted for attacking or gaining access to an automated data processing system,
- Complicity - Organised gang scam,
- Association of criminals with a view to committing a crime or offence punishable by 5 years of imprisonment at least,
- Money laundering of crimes or offences in organised gangs,
- Provision of cryptology services to ensure confidentiality functions without a declaration of conformity,
- Provision of a cryptological means not exclusively ensuring authentication or integrity control functions without prior declaration,
- Import of a cryptology means that does not exclusively perform authentication or integrity control functions without prior declaration.
Involved with CSAM means having a platform that openly has CSAM and not doing anything about it despite many warnings and requests in years. Do you disagree that is illegal? See my examples of others being charged...
> Involved with CSAM means having a platform that openly has CSAM and not doing anything about it despite many warnings and requests in years
You'd have to show that he knew about specific instances and declined to intervene.
I'm not saying that's unlikely. But I haven't seen it shown, and I suspect part of what the French police are trying to get is that evidence of wilful inaction versus gross negligence.
So in your view it's perfectly ok to run an internet service where you don't check for CSAM? You're entitled to that view, I suppose, but it's a minority view. Allowing CSAM on your platform is illegal in most places (including the US, where the law that protects Google, Facebook et al. has a carve-out for that sort of thing), and in all of those jurisdictions the public is not going to be able to see the CSAM or examine the substance of the allegations either before or during trial.
Every social network or file-sharing site that I've been aware of has a Trust and Safety department for just this reason, even X. The executives don't want to go to jail.
> So in your view it's perfectly ok to run an internet service where you don't check for CSAM?
Well, that's quite the assumption. The commenter you've replied to said nothing like this. And yet this is your first conclusion?? Is this how you operate in real life, at your job?
Telegram does moderate for CSAM. The claim that it does not is completely unsubstantiated. You can find CSAM across Meta's products. Does that mean they do not check for CSAM? No.
They ignore taking action when confronted with it. That's why Durov is a disgusting human being.
"the app had gained a reputation for ignoring advocacy groups fighting child exploitation.
Three of those groups, the U.S.-based National Center for Missing & Exploited Children (NCMEC), the Canadian Centre for Child Protection and the U.K.-based Internet Watch Foundation, all told NBC News that their outreach to Telegram about child sexual abuse material, often shorthanded as CSAM, on the platform has largely been ignored."
Those are the charges and the core argument. Do you disagree?
"Telegram is a key component of the ecosystem of individuals trading and selling child sexual abuse materials, and is the only major platform to implicitly allow the exchange of CSAM on private channels, many of which are not end-to-end encrypted.” Stamos is now chief information security officer at cybersecurity company SentinelOne.
A June report from the Stanford Internet Observatory found that Telegram was the only major platform not to forbid illegal material in private channels and chats. “Telegram has also been observed by SIO as failing to perform even basic content enforcement on public channels, with instances of known CSAM being detected and reported by our ingest systems,” the report said.
Jean-Michel Bernigaud, secretary general of Ofmin, a French police agency focused on preventing violence against minors, said in a LinkedIn post Monday that Durov’s arrest was related to the app’s inability to deal with offensive content against minors. “At the heart of the case is the absence of moderation and cooperation on the part of the platform,” Bernigaud said, “especially in the fight against child sex crimes.”
Yes, of course I disagree. This is one claim among many. Specifically, this appalling characterization and accusation you made:
> Worse, Durov was involved with CSAM material and did nothing about it. For this, he is immoral and in my opinion, a disgusting human being.
You have no idea who Durov is. You have a handful of claims by French authorities without evidence and you immediately jump to loathing this person and reviling them as a "disgusting human being"? Shame on you.
I'm glad we agree that if the claims are true, he would be considered an appalling and awful figure. We'll see what the court decides, that's how justice works.
I trust the French government and several global monitoring CSAM agencies more than I trust a random Twitter or HN user.
You keep good company: the French government continues to shelter the actually convicted sexual predator, Roman Polanski, and sheltered Iran's first Supreme Leader, Ayatollah Khomeini.
You didn't provide a link to the SIO report, but I assume this is it: [1]. The report is mostly dedicated to teenagers trying to find ways to sell self-filmed content. You cherry-picked claims against Telegram to make the allegations look more serious than they are, and didn't mention that there are more serious claims against Western platforms.
This is a quote from the beginning of the report:
> Large networks of accounts, putatively operated by minors, are openly advertising self-generated child sexual abuse material (SG-CSAM) for sale.
(By the way, this might be because it is very difficult to find a legitimate job if you are a teenager without any special skills or talents. Why doesn't the government do anything to change this? Where are teenagers from poor families supposed to get money?)
> Instagram is currently the most important platform for these networks, with features that help connect buyers and sellers.
> Instagram’s recommendation algorithms are a key reason for the platform’s effectiveness in advertising SG-CSAM.
> Twitter had an apparent regression allowing CSAM to be posted to public profiles, despite hashes of these images being available to platforms and researchers.
Can we expect to see Musk and Zuckerberg in the same jail with Durov then? Or justice doesn't apply to everyone equally?
Note also that the report gives following recommendations in the conclusion:
> When an account is identified as selling SG-CSAM, disabling the account should be accompanied by messaging to the seller to attempt to discourage recidivism. This messaging might include:
> The fact that this content is widely illegal and can result in prosecution; being a minor does not prevent legal consequences
So basically what the report suggests is not doing something to help teenagers from poor families find a legitimate job, but threatening them with a jail term for selling their own photos. So American!
> A June report from the Stanford Internet Observatory found that Telegram was the only major platform not to forbid illegal material in private channels and chats.
If you read the report, this means that Telegram's ToS do not explicitly forbid posting illegal material in private groups. But do you need to explicitly forbid what is already forbidden by law?
The report contains further claim though:
> It further states that “All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them.”
This is alarming, but it is not exactly how it works, because you can actually report messages even in one-to-one private chats (for example, if you get spam from a new contact) and senders can get blocked. I never received illegal material from contacts (only spam), so I don't have experience reporting it.
> Telegram has also been observed by SIO as failing to perform even basic content enforcement on public channels, with instances of known CSAM being detected and reported by our ingest systems
If you read further, by "failing to perform basic content enforcement" they mean that Telegram doesn't check posted images against a database of known CSAM, and they imply that Telegram is obliged to do this. However, I am not sure the law requires this.
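For reference, the "content enforcement" SIO describes is hash matching: fingerprint each uploaded file and look the fingerprint up in a database of known material. A simplified sketch using a cryptographic digest (real systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding, which SHA-256 does not; the database entry below is a placeholder, not real data):

```python
import hashlib

# Hypothetical fingerprint database. In production this would be a
# vendor-supplied perceptual-hash list, not SHA-256 digests.
KNOWN_BAD_DIGESTS = {
    # SHA-256 of the empty file, used here purely as a placeholder entry.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def digest(data: bytes) -> str:
    """Fingerprint an uploaded file."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(uploaded: bytes) -> bool:
    """True if the upload exactly matches a fingerprint in the database."""
    return digest(uploaded) in KNOWN_BAD_DIGESTS
```

Note the limitation this sketch makes visible: an exact-digest check misses any re-encoded copy, which is why the real deployments rely on perceptual hashing.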
Now I want to comment on other vaguely written claims.
> Telegram is a key component of the ecosystem of individuals trading and selling child sexual abuse materials,
What makes Telegram a "key component"? Did Durov design Telegram and add features with the primary intent of making the sale of illegal materials easier? This sounds implausible.
> At the heart of the case is the absence of moderation
Does he mean a lack of pre-moderation (reviewing every message before posting) or a lack of response to reports? There definitely is moderation on Telegram, so the "absence of moderation" doesn't ring true to me. It would be good if they presented more details instead of vague words.
> absence of ... cooperation
"cooperation" is a vague word. Maybe France just wants to be able to read all messages in private groups under an excuse of fighting crime? This would be a completely different story then.
No social network is perfect, but talk to me when Twitter or Meta ignore CSAM agencies' repeated requests. I'll be waiting. Otherwise, Telegram is complicit.
"the app had gained a reputation for ignoring advocacy groups fighting child exploitation.
Three of those groups, the U.S.-based National Center for Missing & Exploited Children (NCMEC), the Canadian Centre for Child Protection and the U.K.-based Internet Watch Foundation, all told NBC News that their outreach to Telegram about child sexual abuse material, often shorthanded as CSAM, on the platform has largely been ignored."
Ignoring reports of illegal material is one thing; ignoring invitations to join US-based programs or to cooperate with them, which Telegram is not required to do by law, is a different thing. The article for some reason doesn't clearly state which it means; the author uses vague, ambiguous wording instead, like politicians do.
The article mentions a 2023 SIO report [1] on minors trying to earn money by selling their own photos online; the report mentions Telegram, but notes that Instagram and Twitter are worse:
> Instagram is currently the most important platform for these networks, with features that help connect buyers and sellers.
> Instagram’s recommendation algorithms are a key reason for the platform’s effectiveness in advertising SG-CSAM.
> Twitter had an apparent regression allowing CSAM to be posted to public profiles, despite hashes of these images being available to platforms and researchers.
Yet, for some strange reason Musk and Zuckerberg are not under investigation.
Note that the report also doesn't give governments any recommendations for helping minors earn the money they need legally, which would address the root issue.
They simply can't collect much. Plus keeping secret records would be massive legal liability, especially in the EU.
The protocol is public. Signal software is free and open source. Its mobile clients, desktop client, and server are all published under the AGPL-3.0. Any modifications to the app binaries or protocols would be quickly noticed. Hackers and cybersecurity companies are not fools.
I don't understand why Signal is not pursuing reproducible builds. It looks suspicious. Verifying a binary by hand takes a huge effort and can only be done by knowledgeable people. Case in point: nobody noticed or cared when Signal shipped binary updates without releasing the corresponding sources.
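A reproducible build would reduce that huge verification effort to rebuilding from the published source and comparing digests with the distributed binary. A sketch of just the comparison step, assuming the hard part (a bit-identical build environment) is solved; the file names are placeholders:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def builds_match(local_build: Path, published_binary: Path) -> bool:
    """With reproducible builds, anyone can check the store binary
    against their own build of the released sources."""
    return sha256_of(local_build) == sha256_of(published_binary)
```

The value is that the check is mechanical: no disassembly expertise needed, just a matching toolchain and a digest comparison.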
We don't know what he has been asked. The French authorities are testing the waters to see how much they can control the guy.
Given the uproar, it seems it's not the time, and so they claim they are only concerned about CSAM.
There is the right way and the wrong way.
Wrong way: Telegram keeps records and refuses to cooperate. Telegram faces consequences.
Right way: Signal cooperates. They give two Unix timestamps: one for when the account was created and one for when the account last connected to the Signal service. https://signal.org/bigbrother/cd-california-grand-jury/ See also: https://signal.org/bigbrother/
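Those two values really are bare Unix timestamps (seconds since the 1970 epoch), which is the entirety of what Signal can hand over. Converting one is a one-liner; the second timestamp below is an arbitrary example, not a value from the actual response:

```python
from datetime import datetime, timezone

def to_utc(ts: int) -> str:
    """Render a Unix timestamp as an ISO 8601 UTC string."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()

# e.g. account-created and last-connected values (arbitrary examples)
print(to_utc(0))           # 1970-01-01T00:00:00+00:00
print(to_utc(1609459200))  # 2021-01-01T00:00:00+00:00
```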