Digital Rights Groups Lambast European Commission's Online Surveillance Proposal

BRUSSELS — Digital rights and privacy advocates are raising the alarm about an EU legislative proposal unveiled yesterday by the European Commission, allegedly to address “misuse of online services” and “prevent and combat child sexual abuse online.”

In a statement, the Brussels-based European Commission justified its plan for addressing the issue of CSAM by contending that “the current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires.”

The proposed rules, the EC notes, “will oblige providers to detect, report and remove CSAM on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.”

The proposal creates a new continental bureaucracy, the EU Centre on Child Sexual Abuse, granted powers of surveillance to “facilitate the efforts of service providers by acting as a hub of expertise, providing reliable information on identified material, receiving and analyzing reports from providers to identify erroneous reports and prevent them from reaching law enforcement, swiftly forwarding relevant reports for law enforcement action and by providing support to victims.”

A New Euro Bureaucracy, With Online Surveillance Powers

This new EU Centre bureaucracy will be the clearing house for what the proposal calls “detection orders.”

These “detection orders,” the EC explained, “will be issued by courts or independent national authorities. To minimize the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and Europol. Both providers and users will have the right to challenge any measure affecting them in Court.”

The proposal mandates an intricate system for content review and moderation:

  • Providers of hosting or interpersonal communication services “will have to assess the risk that their services are misused to disseminate child sexual abuse material or for the solicitation of children, known as grooming. Providers will also have to propose risk mitigation measures.”
  • Then, each country in the EU “will need to designate national authorities in charge of reviewing the risk assessment. Where such authorities determine that a significant risk remains, they can ask a court or an independent national authority to issue a detection order for known or new CSAM or grooming. Detection orders are limited in time, targeting a specific type of content on a specific service.”
  • When a website or platform receives one of these detection orders, they “will only be able to detect content using indicators of CSAM verified and provided by the EU Centre. Detection technologies must only be used for the purpose of detecting CSAM. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.”
  • Providers that have detected CSAM online “will have to report it to the EU Centre.”
  • The new laws should enable national authorities to “issue removal orders if the CSAM is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions.”
  • The new laws will also mandate that app stores “ensure that children cannot download apps that may expose them to a high risk of solicitation of children.”
  • The transnational, politically appointed EU Centre will monitor online service providers, and determine if they are indeed “complying with their new obligations to carry out risk assessments, detect, report, remove and disable access to CSAM online, by providing indicators to detect CSAM and receiving the reports from the providers.”
  • The EU Centre will have vague competencies, interacting with “national law enforcement and Europol, by reviewing the reports from the providers to ensure that they are not submitted in error, and channelling them quickly to law enforcement.”

The document concludes by claiming that it is part of a new European strategy “for a better internet for kids.”

'A Worrying Day' for EU Privacy

The advocacy group European Digital Rights released a statement yesterday lambasting the European Commission’s online CSAM proposal for “failing to find right solutions to tackle child sexual abuse.”

“Today is a worrying day for every person in the EU who wants to send a message privately without exposing their personal information, like chats and photos, to private companies and governments,” EDRi stated, adding that the European Commission’s proposal includes “measures which put the vital integrity of secure communications at risk.”

The group criticized Commissioner Ylva Johansson, who has been behind the continental push for increased surveillance, for spearheading a proposal “which could still force companies to turn our digital devices into potential pieces of spyware, opening the door for a vast range of authoritarian surveillance tactics.”

The proposal, the statement continued, puts “journalists, whistleblowers, civil rights defenders, lawyers, doctors and others who need to maintain the confidentiality of their communications at risk.”

Tech news site TechDirt has called the EC proposal “Europe’s own version of the EARN IT Act,” likening it to the controversial bipartisan U.S. proposal to limit Section 230 protection and establish a new state bureaucracy to monitor and surveil internet content, purportedly to battle “online harms.”

Copyright © 2025 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
