Chat control – or, by its full name, the EU's Regulation to Prevent and Combat Child Sexual Abuse (CSAR) – is back, and more dangerous than ever. Under the Danish Council Presidency, the EU Council is pushing ahead with a proposal that would require mandatory client-side scanning of all private communications, including encrypted messages. This would mean the end of confidential communication in Europe and create a surveillance system that undermines the rights of over 450 million EU citizens.
First of all, the obvious: nobody is questioning the necessity of child protection. It is an important concern that law enforcement authorities need to tackle more intensively, with targeted and appropriate methods. But destroying encryption by infiltrating it with client-side scanning and opaque AI tools, placing each and every one of us under constant suspicion, is not protection, but mass surveillance. It is a system that treats everyone as potentially criminal while doing little to prevent actual abuse. Even the EU Council's own legal experts have admitted that such measures are illegal under EU law, violate fundamental rights, and would almost certainly be struck down in court.
Privacy for Military & Governments, but not for you
What makes the current draft particularly outrageous is its obvious double standard. According to the current plan, military and state communications are to be exempted from chat control, while citizens and companies have no option to evade this invasive surveillance (unless they encrypt their messages manually, and the question remains whether and when that behavior will be declared illegal). This shows what chat control really is: not a measure for more security, but an instrument of control. If encryption is weakened for ordinary citizens like you and me, it is weakened for everyone. Criminals, meanwhile, will still be able to build their own encrypted apps or encrypt their messages manually.
The EU Council claims that chats can be protected and scanned at the same time, but offers no technical explanation of how this could be possible. The truth is as simple as it is obvious: encryption cannot be secure if authorities can circumvent it. Either the communication is confidential or it is scanned by algorithms. There is no middle ground.
Would you let AI sniff around?
The Danish Presidency's proposal goes even further than previous drafts. Messaging and email providers would not only be forced to search for "known" abusive content, but would also have to use unreliable AI tools to detect "unknown" content. This means that personal photos and private messages could be incorrectly flagged, classified, or even passed on to authorities.
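To make the distinction concrete: matching "known" content typically means comparing a fingerprint (hash) of each file against a database of previously identified material. The minimal sketch below uses a made-up blocklist and illustrative names, not any real detection system, to show why even this exact matching is brittle, and why detecting "unknown" content cannot work this way at all: it requires probabilistic AI classifiers, which inevitably produce false positives.

```python
import hashlib

# Hypothetical blocklist of fingerprints of previously identified files.
# In reality such databases are maintained by dedicated organizations;
# this one-entry set exists purely for illustration.
BLOCKLIST = {hashlib.sha256(b"known-bad-file-bytes").hexdigest()}

def is_known_match(data: bytes) -> bool:
    """Exact hash matching: deterministic, but defeated by any change to the file."""
    return hashlib.sha256(data).hexdigest() in BLOCKLIST

print(is_known_match(b"known-bad-file-bytes"))   # exact copy: matches
print(is_known_match(b"known-bad-file-bytes!"))  # one byte changed: no match
```

Because exact hashes break on trivial edits, deployed systems lean on fuzzy "perceptual" hashes or machine-learning models instead, and those can only ever say how *similar* something looks, which is exactly where innocent photos get misclassified.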
While more and more people want to protect their data from big tech's invasive AI tools, EU policymakers are now pursuing the opposite plan. They want to give opaque algorithms access to every communication that takes place in Europe – except, of course, their own communications and those of the military.
The fight continues
As citizens and businesses in the EU, we say quite clearly: this must not happen. At Tuta, we insist on protecting secure encryption. We need to remind decision-makers that undermining encryption weakens security for everyone, from journalists to companies, from families to activists.
Chat control is the most criticised law in the history of the EU for good reason: it is a master plan for mass surveillance that Orwell could not have written better. It infiltrates encryption and violates the human right to privacy. We at Tuta will never accept it.
Privacy is a fundamental right, and encryption is its foundation.