This blog post discusses proposed regulations concerning child sexual abuse material (CSAM). If you are affected by the issues discussed, you can find guidance via helplines listed in this document.
We are a step closer to protecting the rights of people in Europe to confidential and secure digital communication, after a committee in the European Parliament amended the proposed new rules to prevent and combat child sexual abuse, removing provisions that would have meant indiscriminate surveillance of the content of all communications.
Protecting vulnerable children and eradicating child sexual abuse imagery are critically important objectives. However, the way we go about this must be both effective and proportionate. The rules originally proposed by the European Commission for the upcoming law would have made it impossible to offer the end-to-end encryption technology that we all rely on every single day.
End-to-end encryption is a technology that ensures that, for example, when you send a WhatsApp message to another person, the message is encrypted from the moment it leaves your phone to the moment it arrives at its destination. Only someone using the sending or receiving device can see the contents of the message. Neither Meta, which operates WhatsApp, nor your phone company has access to the contents of your communications when they are encrypted. The technology is also used for things like online shopping transactions and digital banking. It is the reason you can trust that the actions you carry out online won’t put you at risk of becoming a victim of cybercrime.
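To make the idea concrete, here is a deliberately simplified toy sketch in Python of what "only the key holders can read it" means. This is not how WhatsApp actually works (WhatsApp uses the Signal protocol, with public-key exchange and authenticated encryption); it only illustrates the principle that anyone relaying the message without the shared key sees unreadable bytes. The key and message values are made up for the example.

```python
import hashlib
from itertools import count

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream derived from a shared key (illustration only, not real crypto)."""
    out = b""
    for i in count():
        out += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        if len(out) >= length:
            return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the message with the keystream; without the key, the result is noise.
    ks = keystream(key, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))

decrypt = encrypt  # XOR is its own inverse

key = b"shared-secret-known-only-to-two-phones"  # hypothetical shared key
msg = b"See you at 7"
ct = encrypt(key, msg)

assert ct != msg                 # a relaying server sees only this ciphertext
assert decrypt(key, ct) == msg   # only the key holders recover the message
```

The point of the sketch is the trust boundary: the intermediary carrying `ct` learns nothing about the message, which is exactly the property the proposed scanning rules would have broken.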
We have already seen examples of how technologies to scan material for CSAM can have unintended consequences. A New York Times article highlighted a case in 2021 where Google disabled a user’s account after he shared images of his son with a doctor remotely during the pandemic. The company also informed the San Francisco Police Department, which investigated the case. Despite a letter from the police department investigator stating, “no crime occurred”, the user remained locked out of an account that had provided email, a personal and professional address book, and phone service, as well as an archive of family photos documenting the first years of his son’s life.
The European Commission’s proposal would have led to the introduction of ‘client-side scanning’, in addition to the server-side scanning used in the example above. This technology would have automatically intercepted communications on each user’s device, rendering encryption useless. This would be felt in several ways.
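To illustrate why client-side scanning defeats encryption, here is a hedged toy sketch: a scanner that checks content on the user's device against a blocklist of digests *before* anything is encrypted. Real proposals envisage perceptual hashes or machine-learning classifiers rather than the exact-match hashing shown here, and the blocklist value below is invented for the example; the sketch only shows *where* in the pipeline the inspection happens.

```python
import hashlib

# Hypothetical blocklist of digests of known illegal files (illustrative value only).
BLOCKLIST = {hashlib.sha256(b"known-illegal-file").hexdigest()}

def client_side_scan(attachment: bytes) -> bool:
    """Runs on the user's own device BEFORE encryption: the plaintext is
    inspected, so end-to-end encryption no longer hides it from the scanner."""
    return hashlib.sha256(attachment).hexdigest() in BLOCKLIST

assert client_side_scan(b"holiday photo") is False    # benign content passes
assert client_side_scan(b"known-illegal-file") is True  # matches would be reported
```

Because the check runs on the plaintext before encryption ever happens, the encryption that follows protects nothing from whoever controls the scanner, which is the sense in which the proposal would have rendered encryption useless.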
First, it would lead to a loss of privacy. It is often said that laws should be written with the worst case in mind, so let us imagine a worst-case scenario of a government that has become less than democratic. An automated mass-surveillance system of all communications would be nothing short of a gift in this situation, endangering civil society, journalists, and ordinary citizens.
Second, it would undermine trust in the digital society. If you cannot trust that your banking app or website is secure, would you feel safe using it to transfer money or manage your bank accounts? The same goes for online shopping. With a backdoor to encryption, how confident can you be that your payment information and personal data are safe in transit between your phone or computer and the online store?
Finally, it would put every internet user in Europe at greater risk from cybercrime. Protections that help reduce identity theft and other online scams would be weakened, making it much easier for criminals to prey on innocent people.
And this gets to the heart of the problem. We must protect children and young people from sexual abuse and CSAM. But we need to find a way to do that without endangering the fundamental functioning of our modern digital society and without endangering our human and civil rights. So, while we oppose the indiscriminate scanning of private communications, we strongly believe that the amendments to the proposal can deliver this protection for vulnerable children through a ‘security by design’ approach, including targeted monitoring of high-risk public chats, warnings around inappropriate search terms, and proactive monitoring of the public and dark web.
We have not reached the end of this process. The amendment was made in a committee of the European Parliament, and we expect to see the proposal and amendments debated in late November. CEPIS will continue to argue strongly for a proportionate and balanced approach that protects both children’s rights and everyone’s rights to liberty and security.