Digital Conflicts is a bi-weekly briefing on the intersections of digital culture, AI, cybersecurity, digital rights, data privacy, and tech policy with a European focus.
Brought to you with journalistic integrity by Guerre di Rete, in partnership with the University of Bologna's Centre for Digital Ethics.
New to Digital Conflicts? Subscribe for free to receive it by email every two weeks.
N.13 - 10 September 2024
Authors: Carola Frediani and Andrea Daniele Signorelli
In this issue:
Telegram and France
Is Telegram Really Privacy-Friendly?
If You Want Privacy, Use Signal
The Telegram Case
August certainly ended with a bang for those who follow platform developments, with the largely unexpected arrest of Pavel Durov, the 39-year-old founder of the Telegram app. In short, the Russian-born co-founder of the messaging platform (by now almost a social network), who also holds Emirati, French, and Saint Kitts and Nevis passports, was arrested as he stepped off his private jet in France. He was held in police custody, initially for 24 hours and then for up to 96, and finally released on Wednesday, August 28, after being formally indicted on a series of still-vague charges. These charges seem to center on the idea that Telegram does not cooperate enough with investigations into specific criminal activity on the platform. Durov was released on bail (set at 5 million euros), cannot leave France, and must report to a police station twice a week.
France and Europe
European regulations have nothing to do with this case, as spokespeople for the EU Commission felt compelled to clarify. The initiative came from France and the Paris prosecutor's office. "Telegram's almost total lack of response to judicial requests was brought to the attention of the cybercrime section (J3) of the national organized crime court (JUNALCO) at the Paris prosecutor's office, in particular by the national office for minors (OFMIN)," reads a statement from the prosecutor's office, which includes a list of charges.
It all began in February 2024, when the Paris court opened a preliminary investigation and tasked OFMIN with conducting it. The Center for Combating Cybercrime (C3N) and the National Anti-Fraud Office (ONAF) subsequently took over the investigation.
According to Politico, Durov’s troubles began with a separate investigation into child sexual abuse, in which a suspect allegedly used Telegram to lure underage girls by threatening to distribute CSAM (Child Sexual Abuse Material) on social media, including the revelation of a rape. When French authorities requested information to identify the suspect, Telegram reportedly did not respond, leading to a preliminary investigation into its reluctance to cooperate with law enforcement in a criminal matter.
Although Europe is not involved in the French investigation, it could soon step in thanks to the Digital Services Act (DSA). Since February, the new law has required all platforms operating in the EU to protect users from illegal and harmful content. Platforms with more than 45 million monthly active users in Europe, however, are subject to stricter obligations and are regulated directly by the Commission (rather than by national authorities). Telegram, which has come under scrutiny for its social media-like features (with groups and channels hosting thousands of users), had so far escaped the stricter regime by claiming in February to have only 41 million users across the 27 EU countries. As a result, Telegram is overseen by Belgium's telecom regulator BIPT, acting as its so-called Digital Services Coordinator. But the Commission remains unconvinced.
According to the Financial Times, "Telegram said in February it had 41mn users in the EU. Under the EU’s Digital Services Act (DSA), Telegram was supposed to provide an updated number this month but did not, only declaring it had ‘significantly fewer than 45mn average monthly active recipients in the EU’. The failure to provide the new data puts Telegram in breach of the DSA, two EU officials said, adding it was likely the EU’s probe would find the true number was above the threshold for ‘very large online platforms’. Such a designation brings greater obligations for compliance and content moderation, third-party auditing and mandatory data sharing with the European Commission".
Encryption
The charges also cite two laws regarding the import and provision of cryptology services/tools, for which documentation (a declaration of compliance) appears to be missing.
On a legal level, these French regulations are discussed here: "The supply, import and export of cryptology means in and from France are subject to a prior declaration or a prior authorisation of the French National Cybersecurity Agency (ANSSI), depending on technical functionalities and commercial operation (provision or import)".
This angle seems secondary for now, although some American media outlets, as noted by US journalist Marcy Wheeler, continue to discuss it as if it were a crime related to the use of encryption, when it appears to be about a registration issue. "Signal, easily the most protective encrypted messaging app, did register under this law when it first applied to offer Signal in French app stores. So, no, [Signal, unlike Telegram, is] not going to be prosecuted under that law, because they’re following the law".
The journalist's reference to Signal stems from comments made by one of Signal's lead developers (now at Apple), Frederic Jacobs, who recalls the bureaucratic task of getting the app published on the French App Store. He adds, "Good reminder that France is one of the rare countries in the world to have a declaration obligation when importing cryptography. While one doesn’t need approval, it is critical to file an accurate declaration of the encryption system to the Cybersecurity agency ANSSI. According to prosecutors, Telegram failed to accurately complete its declaration".
What Does Signal Say?
It’s worth quoting a response from Meredith Whittaker, president of Signal, to journalist Andy Greenberg (the whole interview is worth reading) after Durov’s arrest in France.
Greenberg asks Whittaker, who had just been in France and seemed interested in Europe as a possible base for Signal: "Does it really make sense to look for that kind of jurisdictional flexibility in Europe when Telegram founder Pavel Durov was just arrested in France? Does this give you pause about Signal’s future in the EU?".
Whittaker replies: "Well, to start: Telegram and Signal are very different applications with very different use cases. Telegram is a social media app that allows an individual to communicate with millions at once and doesn't provide meaningful privacy or end-to-end encryption. Signal is solely a private and secure communications app that has no social media features. So we're already talking about two different things.
And as of today [August 27, 2024] there are simply too many unanswered questions and too little concrete information about the specific rationale behind Durov’s arrest for me to give you an informed opinion. On the broader question, let's be real: There's no state in the world that has an unblemished record on encryption. There are also champions of private communications and expression everywhere in the world—including many in the French government and in Europe beyond. Those of us who’ve been fighting for privacy for the long term recognize that this is a persistent battle, with allies and adversaries everywhere. Trying to prioritize flexibility is not the same thing as idealizing one or another jurisdiction. We're clear-eyed about the waters we need to navigate, wherever they are. We see a huge amount of support and opportunity in Europe.
And there are really big differences between states, even in Europe. Germany is considering a law mandating end-to-end encryption, while Spain has been at the tip of the spear on pushing for undermining encryption. So again, it's not a monolith”.
Telegram and Durov’s Response
In recent days, there has been some discussion about the wording of Telegram’s FAQs, after some recent changes were noticed. Specifically, a sentence that used to state, “All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them,” was reportedly removed. However, others point out that the same sentence still appears elsewhere in the FAQ.
More interesting is Pavel Durov’s statement on his Telegram channel (signed as Du Rove, a playful twist on his name that reportedly even ended up on his controversial French passport, as Mediapart explained).
In essence, Durov seems to acknowledge that Telegram can do more to combat criminal activity, stating that it is not an "anarchic paradise" and expressing a willingness to engage.
Durov (or Du Rove, if you prefer) writes: "However, we hear voices saying that [what we’ve done is] not enough. Telegram’s abrupt increase in user count to 950M caused growing pains that made it easier for criminals to abuse our platform. That’s why I made it my personal goal to ensure we significantly improve things in this regard".
Is Telegram really privacy-friendly?
Privacy is often invoked when discussing Telegram, but this reputation is largely overstated. Not only does Telegram collect metadata about its users (such as who is messaging whom, how many messages they exchange, where, and at what time), but, more importantly, unlike Signal, WhatsApp, and iMessage, it does not apply end-to-end encryption to communications by default.
Telegram's popularity is largely tied to its hybrid nature: it functions both as a messaging app and as a social network that is highly permissive about the content shared on it. Unlike WhatsApp, Telegram allows the creation of public channels and thematic groups that can be shared via links, yet it is far more tolerant of illegal or hateful content than regular social networks such as Facebook or Instagram.
“Telegram looks much more like a social network that is not end-to-end encrypted,” John Scott-Railton, senior researcher at Citizen Lab, told The Verge. “And because of that, Telegram could potentially moderate or have access to those things, or be compelled to.”
A lot of controversial online activity has moved to Telegram because of its permissiveness, leading some to compare it to spaces like 4chan or the dark web. But is that a fair comparison? Only partially. Telegram has shown in the past that it is not indifferent to abuse of its platform, and has on its own shut down some of the app's most violent groups.
More recently, Telegram has blocked extremist channels following media revelations, while the Stop Child Abuse channel, which fights the sharing of child sexual abuse material, claims that the platform has blocked over a thousand channels involved in this activity (though some believe that these channels are merely hidden, making them harder to find, rather than removed entirely).
So Telegram is permissive toward illegal content, but it occasionally moderates it. Moreover, it does not end-to-end encrypt all communications, which would leave it technically unable to hand anything over to law enforcement, as is the case with Signal; instead, it has access to that information but refuses to cooperate. "Because Telegram does have this access, it puts a target on Durov for governmental attention in a way that would not be true if it really were an encrypted messenger”, said Scott-Railton.
Telegram has long operated on a razor's edge, attracting privacy-conscious users without implementing the kind of robust encryption that would protect all users and the platform itself. When child abuse or terrorism takes place in plain sight, platforms have a clear legal responsibility to moderate such content. This is true in both Europe and the United States.
Paradoxically, Durov's problems may stem in part from not protecting privacy enough, rather than from protecting it too much. If all communications were end-to-end encrypted, some of the charges would carry less weight, or might not apply at all. And this is all the more true because encryption is, of course, not illegal.
There is another problem that probably worries the CEOs of social media and messaging platforms: plenty of crime also happens on Facebook and other major social networks, and while Facebook and the others try to moderate content, they do not always succeed. Since Telegram too has blocked at least some illegal content, albeit poorly, the question arises: how little moderation does it take before someone gets arrested?
If You Want Privacy, Use Signal
Beyond Telegram's specific features, Durov's arrest has brought privacy back into the spotlight, making it once again a target for those who believe that private communications mainly serve to protect criminals and extremists, overlooking how important privacy is for political dissidents in undemocratic countries or for communications between journalists and their sources. Even the European Commission has recommended that its staff use Signal, a much more secure app than Telegram.
However, Signal too has often come under fire. Applications that prevent law enforcement from accessing messages and user metadata have been criticized not only by authoritarian regimes like China (which blocked the app in March 2021), but also by some politicians in democracies such as the United States and France, and even by some children's rights organizations.
Signal also protects those who use privacy for far nobler reasons: Hong Kong activists, Black Lives Matter protesters, and anti-coup demonstrators in Myanmar are just a few of the groups that have used the encrypted app to communicate and organize. The United Nations itself has recommended using Signal to send evidence of abuses by totalitarian regimes to journalists and NGOs.
For better or worse, Signal (which has about 70 million users compared to Telegram’s 1 billion and WhatsApp’s 2.7 billion) is the platform of choice for those who want to keep their conversations and metadata private.