Would we accept a postman reading all our letters (or WhatsApp messages) to prevent crime? This debate will be decided in the coming weeks. The European Union, under the Spanish presidency, is debating a new regulation to prevent the spread of images of child sexual abuse.
The proposal is the most heavily criticized piece of technology legislation of the last decade: from Edward Snowden to the European Data Protection Supervisor, from the Council of Europe Commissioner for Human Rights, Dunja Mijatović, to hundreds of academics, the vast majority of technology and human rights experts see the proposal as a threat to the confidentiality of communications, and therefore to freedom of expression, association and movement, and more generally as a threat to our autonomy.
The proponents of this law, in particular Home Affairs Commissioner Ylva Johansson and the directorate she leads within the European Commission, frame the debate as a supposed dichotomy between the protection of children and the protection of other human rights. This framing hides a fundamental question from public opinion: is this proposal actually capable of protecting children? Only if the answer is “yes” does the debate as framed make sense.
The first question is: can the objective of the proposal be achieved as stated? Does the technology exist to properly implement this policy? From a technical perspective, the answer is simple: no. Current detection techniques are not nearly as accurate as would be required to detect child sexual abuse material (CSAM): they would either miss a large amount of CSAM or report a large number of false positives (harmless material incorrectly flagged as CSAM). Given the volume of material to be scanned, processing these false positives would require a workforce that does not currently exist, or would result in a large number of false accusations.
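The scale problem described above can be illustrated with a back-of-the-envelope calculation. All the numbers below (message volume, prevalence, error rates) are illustrative assumptions, not figures from the proposal or from any deployed system:

```python
# Illustrative base-rate calculation: even a very accurate classifier,
# applied to billions of messages, produces an enormous number of
# false positives. Every figure here is an assumption for illustration.

messages_per_day = 10_000_000_000   # assumed EU-wide daily message volume
csam_prevalence = 1e-6              # assumed fraction of messages that are CSAM
false_positive_rate = 0.001         # assumed 0.1% false-positive rate
true_positive_rate = 0.9            # assumed 90% detection rate

actual_csam = messages_per_day * csam_prevalence
detected = actual_csam * true_positive_rate
false_alarms = (messages_per_day - actual_csam) * false_positive_rate

print(f"genuine CSAM messages: {actual_csam:,.0f}")
print(f"correctly detected:    {detected:,.0f}")
print(f"false alarms:          {false_alarms:,.0f}")
# Under these assumptions, false alarms outnumber genuine detections
# by roughly a thousand to one.
```

Because harmless messages vastly outnumber illegal ones, even a small false-positive rate swamps the genuine detections; this is the base-rate effect the critics of the proposal point to.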
But the problem does not end there. Not only are these techniques inaccurate; they are also remarkably easy to circumvent. Several studies show that very simple manipulations, imperceptible to the human eye, can cause detectors to miss genuine CSAM or to flag harmless material as CSAM. Those who want to distribute CSAM will therefore continue to do so with impunity, while the rest of the population has all of its content scanned to no benefit. There is a real risk that young people who, for example, share consensual sexual content will see their most intimate photos end up in the hands of the police, Europol and officials of the future European agency that would handle this material. Or that parents seeking information about their children’s illnesses will be accused of terrible crimes, with all the consequences that entails.
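The simplest form of this evasion can be sketched with the standard library alone. Detection by exact cryptographic hash matching is defeated by changing a single byte of a file; the perceptual hashes used by real systems are more robust to such changes, but the research cited above shows they too can be evaded by targeted perturbations. A minimal sketch of the exact-match case, using a synthetic byte string in place of a real image file:

```python
import hashlib

# A toy "image": in reality this would be a file's raw bytes.
original = bytes(range(256)) * 100

# Flip a single bit, an alteration no viewer would notice in a real image.
modified = bytearray(original)
modified[0] ^= 1

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(modified)).hexdigest()

print(h1 == h2)  # False: the hashes no longer match, so a blocklist
                 # lookup keyed on h1 misses the modified copy entirely
```

The asymmetry is the point: a motivated distributor needs one trivial change to slip past the filter, while every innocent user's content is scanned regardless.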
From a technical perspective, there is no guarantee that the regulation will have a positive effect. Although scientists and members of civil society have repeatedly demanded that its advocates provide evidence of the alleged benefits, none has been produced; on the contrary, there is ample evidence that these benefits cannot be achieved.
The second question is: is the measure proportionate? Can this law be implemented without risking significant harm to the fundamental rights of all people, including the very children it is intended to protect? Unfortunately, the answer is once again no.
If implemented, the proposal will break every guarantee of confidentiality that encryption currently offers. Commissioner Johansson and the artificial intelligence companies supporting the proposal (such as Thorn or Microsoft) claim there is no risk, ignoring the very definition of confidentiality. To say that scanning everyone’s electronic communications has no impact on confidentiality and data protection is like saying that reading a letter before it is put in its envelope has no impact on the confidentiality of analog communications, when the envelope is precisely what protects our correspondence.
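The letter-and-envelope analogy maps directly onto how client-side scanning works: the scan runs on the plaintext before encryption, so the cryptographic "envelope" no longer guarantees that only sender and recipient see the content. A schematic sketch, in which the blocklist, the report mechanism and the toy XOR "cipher" are all hypothetical placeholders, not any real messenger's implementation:

```python
import hashlib

# Hypothetical blocklist of forbidden-content hashes (illustrative only).
BLOCKLIST = {hashlib.sha256(b"example forbidden content").hexdigest()}

reports = []  # content leaked to a third party despite "end-to-end encryption"

def encrypt(plaintext: bytes) -> bytes:
    # Stand-in for a real end-to-end cipher; its strength is irrelevant
    # because the scan below happens before it runs.
    return bytes(b ^ 0x5A for b in plaintext)

def send_message(plaintext: bytes) -> bytes:
    # Client-side scanning inspects the plaintext BEFORE encryption,
    # so the confidentiality of the ciphertext no longer matters.
    if hashlib.sha256(plaintext).hexdigest() in BLOCKLIST:
        reports.append(plaintext)  # the content leaves the user's control
    return encrypt(plaintext)      # encryption happens too late

send_message(b"an ordinary private message")
send_message(b"example forbidden content")
print(len(reports))  # 1: one message was read before it was ever encrypted
```

However strong the cipher in `encrypt` may be, the confidentiality property is already gone by the time it is applied; that is the sense in which critics say the proposal breaks encryption without touching the mathematics of encryption itself.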
In this context, it is worth highlighting the central role played by the lobbies of artificial intelligence companies, exposed in an investigation recently published by independent media outlets, which revealed the de facto alignment between private interests, Europol and certain political actors seeking to outlaw the confidentiality of communications.
The other question that has not yet been answered is how to guarantee that the scanning capability can only be used against CSAM. Again, the reality is that this is not technically possible. The algorithms only check whether some bits resemble other bits; they cannot decide whether content is CSAM or not. For now there is only a promise that the system will not be expanded, together with past examples suggesting that expansion is likely, including revelations that Europol has already requested access to all the data (illegal or otherwise) that would be collected as a result of applying this law.
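The claim that the algorithm "only checks whether bits resemble other bits" is visible in the structure of the matching step itself: the same lookup works unchanged for any target list, so restricting it to CSAM is a policy promise, not a technical property. A hypothetical sketch with invented example lists:

```python
import hashlib

def scan(message: bytes, target_hashes: set) -> bool:
    # The matcher only compares bits to bits; nothing in this code knows,
    # or can constrain, WHAT the target list contains.
    return hashlib.sha256(message).hexdigest() in target_hashes

# The same function serves any list a future operator chooses to supply:
csam_list = {hashlib.sha256(b"known illegal image").hexdigest()}
leak_list = {hashlib.sha256(b"leaked government document").hexdigest()}

msg = b"leaked government document"
print(scan(msg, csam_list))  # False
print(scan(msg, leak_list))  # True: expanding the scope is a one-line change
```

Nothing in the infrastructure distinguishes one target list from another; only the list's contents do, and those are set by whoever operates the system.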
Who supports such an invasive rule? Not a large proportion of children and young people, nor many survivors of sexual abuse. According to a European study, around 80% of children in the EU say they would not feel comfortable being politically active or exploring their sexuality if they knew their communications were constantly being monitored. The same study shows that two thirds of young Europeans use encrypted applications such as WhatsApp, Telegram or Signal, and that the same proportion oppose having their conversations scanned.
Survivors of abuse do not support the draft regulation either: Alexander Hanff, an activist and victim of sexual abuse, warns that Commissioner Johansson’s proposed law will leave survivors unprotected when seeking support from the authorities. Another victim of sexual abuse, Marcel Schneider, has sued Facebook for reading his private messages and thereby stripping abuse victims of confidentiality. Not even the police are convinced: both the FBI in the US and police officials in the Netherlands and Germany have warned that the system will produce more reports, many of them false positives, making it harder to find criminals and protect victims. Regrettably, the Spanish government is sticking to its proposal to eliminate the confidentiality of communications by banning encryption, unexpectedly aligning itself with Javier Zarzalejos, the Spanish People’s Party MEP leading the discussion in the European Parliament.
Ultimately, the proposal in its current form guarantees no improvement and represents a threat to our democracy. Children undoubtedly need to be protected; the future of our society depends on them. But it must be done effectively and safely. The Commission’s proposal is neither effective nor safe.
Carmela Troncoso is a researcher at the Swiss Federal Institute of Technology in Lausanne (EPFL) and led the team of scientists that developed the privacy-preserving protocol for Covid contact-tracing applications.
Diego Naranjo is head of public policy at the digital rights NGO EDRi.