A controversial European Union legislative proposal to scan citizens' private messages in a bid to detect child sexual abuse material (CSAM) is a risk to the future of web security, Meredith Whittaker warned in a public blog post Monday. Whittaker is president of the not-for-profit foundation behind the end-to-end encrypted (E2EE) messaging app Signal.
“There is no way to implement such proposals in the context of end-to-end encrypted communications without fundamentally undermining encryption and creating a dangerous vulnerability in core infrastructure that would have global implications well beyond Europe,” she wrote.
The most recent European Council proposal, which was put forward in May under the Belgian presidency, includes a requirement that “providers of interpersonal communications services” (aka messaging apps) install and operate what the draft text describes as “technologies for upload moderation”, per a text published by Netzpolitik.
Last month, Euractiv reported that the revised proposal would require users of E2EE messaging apps to consent to scanning to detect CSAM. Users who did not consent would be blocked from features that involve sending visual content or URLs, the outlet also reported — essentially downgrading their messaging experience to basic text and audio.
The EU’s own data protection supervisor has also voiced concern. Last year, it warned that the plan poses a direct threat to democratic values in a free and open society.
Pressure on governments to force E2EE apps to scan private messages, meanwhile, is likely coming from law enforcement.
Back in April, European police chiefs put out a joint statement calling for platforms to design security systems in such a way that they can still identify illegal activity and send reports on message content to law enforcement. Their call for "technical solutions" to ensure "lawful access" to encrypted data did not specify how platforms should achieve this sleight of hand.