
Signal Foundation cautions against EU’s proposal to scan private messages for CSAM

Signal Foundation President Meredith Whittaker has warned that a contentious European Union proposal to scan users’ private messages for child sexual abuse material (CSAM) poses significant risks to the end-to-end encryption (E2EE) at the heart of the privacy-focused messaging service.

“Mandating mass scanning of private communications fundamentally undermines encryption. Full stop,” Whittaker stated in a release on Monday, “whether this occurs through tampering with encryption algorithms, implementing key escrow systems, or requiring communications to pass through surveillance systems before they are encrypted.”
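Key escrow, one of the mechanisms Whittaker lists, can be pictured with a minimal sketch: a copy of each message key is also wrapped for a third party, which can then read the traffic without either participant’s cooperation. The snippet below is purely conceptual and uses the Python cryptography package’s Fernet primitive for brevity; it does not reflect how Signal or any real E2EE protocol works.

```python
# Conceptual illustration of key escrow, not a real protocol: the per-message
# key is additionally wrapped under an escrow key, so a third party can
# decrypt without the sender's or recipient's cooperation.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()    # held by the escrow authority
message_key = Fernet.generate_key()   # normally known only to sender and recipient

ciphertext = Fernet(message_key).encrypt(b"private message")

# The escrow step: a copy of the message key, readable by the third party.
escrowed_key = Fernet(escrow_key).encrypt(message_key)

# Later, the escrow authority can recover the plaintext on its own.
recovered_key = Fernet(escrow_key).decrypt(escrowed_key)
print(Fernet(recovered_key).decrypt(ciphertext))  # b'private message'
```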

European lawmakers are advancing a regulation to combat CSAM that includes a new provision, dubbed “upload moderation,” under which messages would be scanned before they are encrypted.

A report from Euractiv revealed that audio communications are excluded from the law’s scope, and that users must consent to the detection through the service provider’s terms and conditions.

“Those who decline consent can still access parts of the service that don’t involve sharing visual content and URLs,” the report added.
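Taken together, the reported details describe a flow in which content is checked on the device before it is ever encrypted. The sketch below is a rough, hypothetical illustration of such a pre-encryption check, not the EU’s actual specification: the hash-list approach, the function names, and the consent flag are assumptions made purely for clarity.

```python
# Hypothetical sketch of a pre-encryption "upload moderation" check, based
# only on the details reported above: visual content and URLs are scanned,
# audio is not, and scanning applies only to users who consented via the
# provider's terms and conditions.
import hashlib

KNOWN_HASHES = set()                        # placeholder detection database


def encrypt_for_recipient(content: bytes) -> bytes:
    return content[::-1]                    # stand-in for real E2EE, not actual crypto


def report_match(digest: str) -> None:
    print(f"flagged for review: {digest}")  # stand-in for a reporting hook


def upload(content: bytes, kind: str, consented: bool) -> bytes:
    # Audio is reportedly out of scope; visual content and URLs require consent.
    if kind in ("image", "video", "url"):
        if not consented:
            raise PermissionError("sharing visual content or URLs requires consent")
        digest = hashlib.sha256(content).hexdigest()
        if digest in KNOWN_HASHES:
            report_match(digest)
    # Encryption happens only after the check, which is exactly what critics
    # argue breaks the end-to-end guarantee.
    return encrypt_for_recipient(content)
```

Whatever detection technique is used, whether cryptographic hashes, perceptual hashes, or client-side classifiers, the property critics object to is the same: the plaintext is inspected before encryption takes place.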

In late April 2024, Europol urged the tech industry and governments to prioritize public safety, warning that security measures such as E2EE could prevent law enforcement from accessing problematic content. The appeal reignited the debate over how to balance privacy with the fight against serious crime.

It also emphasized the need for platforms to design security systems that can still identify harmful and illegal activity and report it to law enforcement, without spelling out how this should be implemented.

Apple previously announced plans for client-side scanning to detect CSAM in 2021, but abandoned the effort in late 2022 following backlash from privacy and security advocates.

“Scanning for one type of content could lead to mass surveillance and a desire to monitor other encrypted messaging systems for various content types,” the company stated then, explaining its decision. It described the process as creating a “slippery slope of unintended consequences.”

Whittaker also criticized the “upload moderation” label as a deceptive framing, arguing that the approach is tantamount to creating a security vulnerability that opens the door to exploitation by malicious actors and nation-state hackers.

“Either end-to-end encryption protects everyone, ensuring security and privacy, or it fails for everyone,” she argued. “Weakening end-to-end encryption, especially at such a politically delicate time, is a detrimental move.”
