Signaling To Ghost And Telegram: A Good Time To Talk About E2EE
On Tuesday, September 17th, international law enforcement revealed that it had infiltrated and dismantled the Ghost encrypted messaging platform. The crackdown and subsequent arrests come just weeks after the apprehension of Pavel Durov, the founder and CEO of Telegram. Both cases are still unfolding, with updates on Telegram’s fate coming daily. As the revelations come, a debate over the place of end-to-end encryption (E2EE) and its relationship to numerous messaging platforms has been reignited in time for a major legal decision by EU authorities. Is this part of a pattern creating real reason for concern, or is it all tech-libertarian virtue signaling?
E2EE: A Hot Topic In Europe?
To recap: End-to-end encryption is a method of communication where only the sender and the recipient can decrypt and access the content of a message. The technology ensures that even service providers or intermediaries cannot read the message data. This security feature is central to platforms like Signal, which uses E2EE by default to safeguard user privacy, and is available on Telegram as an opt-in feature.
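The core property can be sketched with a toy key exchange. This is strictly illustrative, assuming nothing about any real app's protocol; production E2EE uses vetted primitives like X25519 and authenticated ciphers, not a small prime and an XOR keystream:

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters (illustration only; real E2EE apps use
# X25519 and authenticated encryption, not 64-bit moduli and XOR).
P = 0xFFFFFFFFFFFFFFC5  # largest prime below 2**64
G = 5                   # generator

def keypair():
    """Generate a (private, public) pair for the toy exchange."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    """Both endpoints derive the same key; the server never sees it."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_cipher(key, data):
    """Toy keystream cipher: encrypting and decrypting are the same op."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob exchange only their PUBLIC values through the server.
alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# The relay server sees only this ciphertext; without either private
# key it cannot recover the plaintext.
ciphertext = xor_cipher(shared_key(alice_priv, bob_pub), b"meet at noon")
plaintext = xor_cipher(shared_key(bob_priv, alice_pub), ciphertext)
print(plaintext)  # b'meet at noon'
```

The point of the sketch is the trust model, not the math: the server relays opaque bytes, which is exactly why providers of true E2EE platforms have so little to hand over when compelled.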
Supporters of E2EE (I consider myself among them) argue that it is essential for protecting the privacy of journalists, activists, and ordinary citizens, while critics say it enables criminal activities by shielding bad actors from law enforcement. This debate has fueled regulatory pushes in Europe to incorporate frameworks like the ePrivacy Regulation, which purports to reconcile the need for security with the right to privacy.
E2EE gets thrown around as a particularly contentious issue in Europe due to the EU’s perceived commitment to protecting privacy via GDPR (the General Data Protection Regulation), paired with its at-times more controversial limits on free speech in the name of security, as in efforts to combat extremism and terrorism. France and Germany in particular catch heat for this.
To an outsider, it feels like a contradiction: the right of individuals to control their personal information carries substantial precedent in Europe. That right is largely enabled by encryption, including E2EE, which has been coming under attack in the name of expanding law enforcement’s capabilities. I’ll note that my personal feeling is that Europe is largely better on these things despite the reputation. After all, the US doesn’t have the greatest history on that end, and has become a veritable wild west of data surveillance.
This conflict is evident in ongoing discussions about encryption backdoors, where governments seek to have a way to access encrypted data for investigations while maintaining the security integrity of these platforms.
Privacy advocates warn that any such access could open the door to misuse and weaken the overall security of encrypted communications. Some advocates, particularly Meredith Whittaker, the President of Signal, state that this is a matter of mathematical fact, not just idealism.
Signal messaging application President Meredith Whittaker speaks during the 2022 Web Summit in Lisbon, Portugal, on November 4, 2022. (Photo by Pedro Fiúza/NurPhoto via Getty Images)
“Instead of accepting this fundamental mathematical reality, some European countries continue to play rhetorical games.”
-Meredith Whittaker, President of Signal
It needs to be noted that the belief that disabling E2EE is a necessary move for law enforcement is not even a conclusion backed by research. It may actually harm government and corporate capacity to comply with current GDPR standards. And considering the current state of the EU’s cybersecurity, I don’t know if lower standards are something that should be accepted.
So there you have it, right? Even if you’re a diehard anarchist, or want what’s best for ‘the system’, E2EE is good? End of discussion?
So Where Does It Stand?
EU legal bodies have maintained a broadly anti-E2EE stance, but any measures against it have not yet been put in place. There have been differing visions between EU ministers, heads of state, and law enforcement regarding where compromises should lie.
The most current disagreements revolve around an understandable topic: Child Sexual Abuse Material (CSAM). The assertion is that, without some form of pre-encryption scanning or another method of checking uploaded material (relying on an as-of-now nonexistent technology), encryption will keep authorities from acting in time. This has been the driving force of the prevailing arguments.
Most EU Council governments have asked for a backdoor in such cases, with Hungary (surprise) asking for an outright ban on E2EE. Viktor Orbán’s government has been leading the charge on these reforms while it holds the rotating Presidency of the EU Council.
Germany, Poland, Austria, Estonia, Slovenia, and Luxembourg are the other countries allegedly still against Hungary’s push. The stated reason is that there hasn’t been compelling proof that a demonstrable balance can be struck between privacy, technical capabilities, and national security with such rules in place.
This could potentially change, as the EU Ministers will be meeting on the fate of such a “Chat Control” mandate in just a handful of days, on October 10th.
However, the Netherlands, which has been agonizing back and forth on where it stands, suddenly announced on October 1st that it wouldn’t support such a measure, citing national security concerns. This may be related to the recent data breaches plaguing the Dutch public sector.
The Netherlands’ EDRi affiliate, Bits of Freedom, has been an instrumental advocate for E2EE.
By changing lanes, the Netherlands will essentially shelve the plan for another year.
So Then Why Was Ghost Dismantled?
Ghost was originally marketed as a highly secure encrypted communication platform. Unlike freely available protocols and platforms such as Signal, Session, or Tox, it was operated in a for-profit, clandestine manner that marketed specific mobile devices for the purpose. For reference, the “privacy phone” market rarely goes well, and is usually a scam.
However, law enforcement agencies claimed that the platform was almost exclusively being used by organized crime to coordinate illegal activities, including drug trafficking and money laundering. The Ghost app operated on modified smartphones that provided end-to-end encryption, self-destruct messaging, and a closed user group that made monitoring difficult.
Screen cap of the now-defunct Ghost site (thank you Hacker News for the archive link)
The international operation to dismantle Ghost involved law enforcement agencies from nine countries, led by Europol. It resulted in the seizure of assets and 51 arrests in Australia, Italy, Ireland, Sweden, and Canada.
According to the Australian Federal Police (AFP) and Europol, law enforcement agencies infiltrated the Ghost platform by compromising the application’s software updates. Australian police technicians, with cryptographic assistance from the French Gendarmerie and US FBI, modified the software updates regularly pushed by Jay Je Yoon Jung, the alleged creator of Ghost. This allowed them to install backdoors into the devices running Ghost, ultimately leading to the platform’s takedown in what the authorities referred to as “Operation Kraken”.
Image included with official release by the Australian Federal Police
The backdoors implemented through the modified updates allowed authorities to collect data directly from the devices using Ghost. This included intercepting messages, video calls, and other communications that were originally believed to be secure. So… spyware?
What’s Happening With Telegram?
Telegram has been under legal pressure in various countries for several years. Germany and France have both gone after the platform for its perceived lack of compliance with local regulations and its refusal to provide access to user data. The assertion is that Telegram’s encrypted messaging can be used to promote illegal activities such as hate speech, terrorist recruitment, and the spread of misinformation.
In early 2024, French authorities began investigating the platform’s role in facilitating extremist content, child exploitation material, and drug trafficking activities. During this time, Telegram implemented minor updates to its terms of service and moderation policies, but these changes did not satisfy regulators.
As a casual user of Telegram, this feels like a half-truth from both parties, as sufficiently evident criminal activity takes place on the platform in publicly available, unencrypted chatrooms all the time. No spyware needed. It’s the de facto platform for DDoS-for-hire, crypto scams, and illegal data sales. Anyone remotely initiated into the cybersecurity industry would know that, but I’ll get more into that at the end of this piece.
What a deal!
Pavel Durov, its founder and CEO, was arrested on August 24th upon landing in Paris after a trip from Azerbaijan, a situation I’m sure many are already familiar with.
The arrest was part of a campaign by European authorities to hold tech leaders accountable for platform misuse, particularly in cases where platforms are used for illegal activities such as CSAM, mentioned above.
French officials charged him with enabling criminal activities on the platform due to Telegram’s historical lack of cooperation. Following his arrest, Durov agreed to comply with some of the French legal requests for user data, a heel-turn from his previous positions.
There is a new policy on data sharing. Telegram now divulges user data to authorities when requested through valid judicial orders. The previous policy only allowed for disclosures in cases involving terrorism or extreme violence. The new policy seems to suggest that sharing IP addresses and phone numbers is now on the table.
The platform’s transparency around data requests remains unclear at this point. Such requests could previously be tracked through its built-in transparency bot. Since the arrest, the bot has been under maintenance, obscuring compliance details.
The most stolen/re-shared photo of Durov-will link to photo credit ASAP
Cooperation like this can be easily interpreted as a concession under pressure, but there have been some substantial changes happening with Telegram in the meantime, covered extensively by 404 Media. They summarize almost all of this info excellently in their recent podcast episode and multiple articles on the matter.
Russian officials expressed concern over Durov’s arrest, viewing it as a possible attempt by French authorities to gain access to encryption keys and user data. They proposed relocating him to Russia or the UAE (famously not-corrupt, solid defenders of free speech and data sovereignty) to avoid compromising the platform’s encrypted communications.
Following the change, many criminal groups and channels on Telegram have begun migrating to other platforms like Signal and Tor-based messaging apps to avoid data-sharing risks.
Has This Happened Before?
Remember EncroChat? Me neither.
Ghost and Telegram are not alone in facing regulatory scrutiny. Other platforms like EncroChat and Sky ECC have also been targeted and dismantled due to accusations of use by criminal networks.
EncroChat, for instance, was compromised in 2020 by law enforcement agencies, who managed to decrypt the platform’s communications. The operation was spearheaded by French and Dutch police, and resulted in nearly seven thousand arrests over the past four years. EncroChat’s business model was also based on a “hardened” Android platform, advertised as “privacy-focused” phones.
After EncroChat was broken up, many of its users moved to Sky ECC, an encrypted messaging platform that supported cryptocurrency payments. It was taken down in 2021 by Europol after being determined to be facilitating organized crime. Formal racketeering charges were brought against founder Jean-Francois Eap by the US DoJ in 2021. Nearly 900 users were arrested within the first year of the Sky crackdown, and thousands of charges are still being litigated.
All of these faced the same scrutiny that Ghost and Telegram do. All of them were broken up in ways identical to Ghost. However, is this indicative of some authoritarian hammer coming down on our chat programs?
Platforms That Are Built Different
Several other encrypted messaging applications—like Signal, Session, and Tox—are also getting extra attention due to the growing governmental scrutiny over E2EE. Should the EU regulations turn against encryption, they’ll be nearly as far outside the law as any of the previously mentioned platforms.
Nearly as outside the law. They’ve all avoided specific legal accusations through some differences in operational principles and organizational structure that will become glaringly obvious in just a few paragraphs.
Signal has taken a proactive stance against new EU regulations, with its employees actively taking part in the campaign to protect E2EE. That said, Signal’s infrastructure is designed to minimize data storage—collecting only basic metadata like the date of account creation and last active date—making it impossible to fulfill most legal data requests. While technically in “compliance” with currently existing laws, this is all of the information that Signal can provide when compelled via a court order.
It should also be noted that Signal is a non-profit, making most of its money from donations and grants. There is no direct way to monetize the app, and the company engages in no transactions with its users.
Session takes this a step further. It’s another free platform that offers a unique approach to privacy by not requiring users to link any personal identifiers such as phone numbers or email addresses. By using the Signal Protocol and onion routing, Session provides an additional layer of anonymity that traditional E2EE apps lack. There is nothing for Session’s owners, the Oxen Privacy Tech Foundation, to hand over if compelled by law. The assumption is that this also makes them compliant by default.
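The layering idea behind onion routing can be sketched in a few lines. This is a toy illustration with made-up hop keys, not Session’s or Tor’s actual protocol: the sender wraps the message once per relay, and each relay can peel exactly one layer, so no single hop sees both who is talking and what is said.

```python
import hashlib

def xor_layer(key: bytes, data: bytes) -> bytes:
    """Toy symmetric layer (illustration only; real onion routing uses
    per-hop authenticated encryption with negotiated session keys)."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

def build_onion(message: bytes, hop_keys: list[bytes]) -> bytes:
    """Wrap the message once per hop, innermost layer first, so the
    first relay peels the outermost layer."""
    onion = message
    for key in reversed(hop_keys):
        onion = xor_layer(key, onion)
    return onion

# Hypothetical per-relay keys, purely for illustration.
hop_keys = [b"relay-1", b"relay-2", b"relay-3"]
onion = build_onion(b"hello", hop_keys)

# Each relay peels only its own layer; intermediate relays see neither
# the original sender's identity nor the final plaintext together.
for key in hop_keys:
    onion = xor_layer(key, onion)
print(onion)  # b'hello'
```

The anonymity win over plain E2EE is in the routing, not the cipher: encryption hides content, while the layered relays hide metadata about who is talking to whom.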
How does Session financially stay afloat? The OPTF supports it through the Oxen cryptocurrency network and donations from users. There is no traditional funding model or money directly poured into the project, and it does not have any method of monetization.
Tox, a lesser-known but highly secure messaging protocol, operates on a decentralized peer-to-peer network, rendering it virtually impossible for any single authority to shut down or monitor.
Tox clients, such as aTox, are built on a protocol with no central servers, making the network resistant to many of the traditional legal tools used to compel companies to provide data.
Since there is no server infrastructure and Tox doesn’t have the widespread adoption of Signal or WhatsApp, it has very little overhead. It operates as a completely free, open-source project. There is no way to directly monetize Tox.
So What’s The Takeaway? (Opinion Time)
Okay. So I think there are a few layers to this that need to be unpacked.
First and foremost, the most dangerous and most pressing issue regarding communication platforms and the state of E2EE is the EU Council’s decision next week. Everyone concerned should be watching for it.
If E2EE is made illegal, or de-fanged in Europe, there will be a precedent set in liberal democracies for surveillance that I don’t think will be undone any time soon.
My fear is that this would be the first of several privacy dominoes to fall around the world. These are the stakes that everyone who is truly an advocate for digital privacy should care about. What’s very heartening is that it seems there are a number of organizations on both sides of the Atlantic committed to this cause.
Thankfully, it seems like the worst-case scenario has been avoided for at least another year.
Virtue Signaling
Time to piss some people off.
What if I told you that you could hold multiple thoughts at the same time: that Europe’s attacks on E2EE are bad and we should fight government overreach, but that Pavel Durov and Jay Je Yoon Jung might also be bad? That they’re not deserving of our advocacy as freedom advocates?
This next point is what I think of as the “distraction”. Not the actual case against Telegram, but how Durov’s arrest is being handled in the media. Particularly right-wing media. Particularly in the US.
Give me a fucking break, dude.
The difference between platforms like Signal, Tox, or Session, and operations such as Telegram, Ghost, and the other dismantled services lies not only in their business models, but in their motivations which fundamentally drive their structures. The difference is ideological.
While these for-profit ventures claimed to offer privacy and security, their business models inevitably created opportunities for abuse—both by criminal enterprises and, indirectly, by the founders who profited from the continued use of their platforms.
No Hero
Unlike true encrypted messaging services, Telegram has operated more as a social media network, with opt-in encryption and a reliance on advertising and investor funding that fundamentally changes its relationship with user data.
Whether you fundamentally disagree with it or not, large social media platforms are required to moderate to a certain degree in North America and Europe. Moderation on Telegram is still lackluster, and has previously been non-existent. Almost everyone in Telegram’s 900-million-plus user community knows that. That’s why they use it in the first place.
It seems clear to me that this is an example of “ignoring the problem while it still makes you money”. This hands-off approach, along with extremely suspicious deference from the Russian and Belarusian governments, has only made its user base explode over its eleven-year run.
Let’s just call a spade a spade: Telegram is a platform that’s great for criminal actors to get outstanding reach on, Durov profited, and pulled the suspiciously familiar-sounding move of claiming he wouldn’t moderate it for “free speech or whatever”. It’s another grift, and it finally bit him in the ass when he got off the plane.
Maybe he once believed in something, but let’s not forget that in his previous life, he was the creator of VKontakte, the Russian Facebook. He’s just the Zuckerberg of Russia.
It’s my belief that Durov’s compliance failures don’t stem from some principled place of free speech absolutism, but from laziness and greed. The same flaw haunts VK’s American counterpart. A refusal to moderate Facebook for monetary reasons enabled horrific crimes throughout the non-English-speaking world.
If Telegram were truly developed as a privacy platform, all data would be end-to-end encrypted by default. Durov’s arrest resulted from his refusal to comply with information requests for data that was readily accessible within Telegram’s infrastructure. This supports critiques made by Signal’s Meredith Whittaker, who pointed out that Telegram’s inconsistent encryption standards make it vulnerable to legal and regulatory scrutiny, unlike platforms that prioritize user privacy at a foundational level.
The very nature of Signal, Tox, and Session makes it theoretically impossible for them to serve the same role as Telegram or Ghost in monitoring or profiting from illicit activities. And while there are some legal challenges to their integrity, they’ve so far managed to weather the storm.
They just had to not be greedy.
They just had to believe in something.
But you won’t hear Elon, Tucker, or Candace Owens talking about that.