Unmasking the plan to restrict encryption and private messaging in Europe

Can European law enforcement agencies force messaging services to crack open the contents of their users' communications to scan for and retain illegal material?

by Yaël Ossowski

Posed as a broad question, the vast majority of Europeans would reject this outright. That is especially true of young people, who are more likely to use these services. Privacy is a principle that most EU citizens value and cherish.

We've come to expect privacy not only in our financial matters and our dealings with government, but also in our personal communications devices and services. Most messaging apps now ship with end-to-end encryption by default, something that gives peace of mind to their many users.

Governments, however, want to crack down.

What if the illegal material consists of images and depictions of child sexual abuse, terrorist activity, or other unsavory and disgusting content?

This is the impetus behind a long-sought EU regulation that many fear would restrict encryption and private messaging on the continent.

Officially, this regulation is called the "Proposal for a Regulation laying down rules to prevent and combat child sexual abuse," but privacy activists have appropriately dubbed it "Chat Control."

The debate over child sexual abuse material (CSAM) is not new, either in the EU or in other countries.

As the vastness of the Internet has become constantly searchable and accessible on our personal electronic devices, many fear that this content will find a much larger audience. And regulators have perked up at the chance.

Beginning in 2025 in the United Kingdom, users wanting to access explicit material on adult websites will need to upload an identity document proving their age.

As recently as January, the major US tech CEOs were grilled intensely by the Senate Judiciary Committee on their various policies to crack down on CSAM, hinting at a legislative solution that would force companies to scan material shared in chats for illegal content.

In the EU, buoyed by various NGOs keeping up the pressure, commissioners have sought to put forward this Chat Control proposal, which would allow police agencies in member states to force some level of scanning of the content shared between users.

Where does it stand now? First, we must go down the EU rabbit hole.

The process of lawmaking at the supranational level of the European Union is a complex labyrinth of overlapping and competing committees, institutions, and authorities that must balance member state jurisdiction, treaty and charter obligations, and the desire to harmonize a European Single Market for goods, capital, services, and people.

The European Commission proposes a regulation, various committees provide their perspectives, and then it is shuffled off to both the Council of the European Union, where Member States have their say, and the European Parliament. Both of these institutions must provide input and amendments, with the Commission weighing in where necessary, and then finally a "trilogue" is called to reconcile all versions of the regulation for final adoption.

The number of people who can fully understand and influence these proceedings is rather limited, almost by design. Lobbyists and government affairs professionals are paid handsome sums to follow it all. Apart from various written consultations, citizens have only their MEPs to rely on for input, while NGOs are restricted to certain opportunities for direct engagement (much more on this will be explored in EU Tech Loop).

Most of the commentary on Chat Control has come from privacy activists and dissident MEPs, most notably German Pirate Party MEP Patrick Breyer.

According to the most recent leaks, the difference with this new CSAM scanning proposal is that it labels various services with a risk calculation, similar to the AI Act and other tech proposals popular in Brussels.

The highest-risk category, likely applied to encrypted messaging services or those that do not collect user data, will face the heaviest emphasis on allowing scanning and screening for illegal content. Think also of VPNs or services that use cryptocurrencies outside the mainstream banking system.

The lowest-risk category, perhaps unencrypted email services, will likely have the least compliance to contend with.

Regardless of this risk classification, the Chat Control regulation would still impose "detection obligations" on various services, obligations that are about as vague as they sound, especially since the drafters claim encryption won't be harmed:

As quoted in the regulation, "providers should remain free to offer services using end-to-end encryption and should not be obliged by this Regulation to decrypt or create access to encrypted data."

This Regulation should limit the exercise of the rights and obligations provided for in Articles 5(1), (3) and 6(1) of Directive 2002/58/EC, insofar as strictly necessary to execute detection orders issued in accordance with this Regulation with a view to prevent and combat online child sexual abuse.

How this regulation will be used, as put forward in the proposal, is via "detection orders" that police services will be able to issue, regardless of where users or services are located:

Those rules should apply to all providers, irrespective of their place of establishment or residence, that offer services in the Union.

What will that mean for your ability to communicate freely and privately on your favorite messaging app or email service? In typical EU fashion, the innocuous-sounding "detection orders" would be applied like a warrant against various services, supposedly targeting higher-risk applications when there is a tip about some kind of illegal activity.

Practically speaking, for anyone who understands how encryption works, this will be virtually impossible. Without segmenting parts of messaging or email services, enforcing detection orders will mean breaking encryption at the request of police. Authorities have always been able to obtain warrants for metadata and account information, but this proposal would require a protocol change to allow for future "detection" on demand.
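To see why, consider a toy model of end-to-end encryption. The sketch below uses the PyNaCl library purely for illustration (the apps named in this piece run their own, more sophisticated protocols); it shows that a provider relaying messages only ever handles ciphertext, so complying with a detection order would mean changing the protocol to expose the plaintext, or something derived from it, before encryption or after decryption.

```python
# A minimal illustrative sketch, assuming the PyNaCl library (pip install pynacl).
# It is not any real app's protocol; it only shows where the plaintext lives.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; private keys never leave it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"See you at 8?")

# This opaque blob is all the service provider ever relays or stores,
# so there is nothing meaningful for a server-side scanner to inspect.
print("What the server sees:", bytes(ciphertext).hex()[:48] + "...")

# Only Bob, holding his private key, can recover the message.
receiving_box = Box(bob_key, alice_key.public_key)
print("What Bob reads:", receiving_box.decrypt(ciphertext).decode())
```

Any mandated "detection" would therefore have to happen on the user's device before encryption, or by weakening the encryption itself, which is precisely what privacy advocates object to.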

It is difficult to see how this aligns with the EU's own treaties and charters.

Article 10 of the European Convention on Human Rights stipulates that expression must be free and not hampered by government authorities:

Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers.

Would a mechanism to allow the scanning of your messaging app communications constitute an "interference" by public authorities? Obviously it would.

Thankfully, companies like Apple have offered end-to-end encryption on their iMessage service from the beginning, and Meta has now rolled out E2E encryption in both WhatsApp and Messenger.

At the same time, there are a number of messaging apps and platforms that only offer encrypted communications, including Signal, SimpleX, Threema, Session, and more. Encrypted email services like ProtonMail and Tuta Mail are growing more popular, and they publicise when they receive government requests for data.

As I mentioned to Forbes, this proposal pays a lot of lip service to the importance of privacy, but ultimately it green-lights the ability of national legislatures to clamp down on encrypted messaging. Regardless of the intent, this remains concerning.

Publications such as Public have done an excellent job in distilling the ramifications of Chat Control, but commentary elsewhere in the EU has been rather limited.

Police agencies in Europe have the means at their disposal to crack down on illegal content and behavior, and they should be empowered to do so.

However, that cannot come at the expense of our own privacy, and our ability to seek out privacy-enhancing technologies that have become vital to our way of life, our relationships and commerce, and how we safeguard our individual rights.

Encryption is a powerful tool that should be both encouraged and protected at all levels of our European institutions. We hope EU regulators come to understand this as well.
