Digital Rights Groups Lambast European Commission's Online Surveillance Proposal

BRUSSELS — Digital rights and privacy advocates are raising the alarm over an EU legislative proposal unveiled yesterday by the European Commission, ostensibly to address “misuse of online services” and to “prevent and combat child sexual abuse online.”

In a statement, the Brussels-based European Commission justified its plan for addressing child sexual abuse material (CSAM) by contending that “the current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires.”

The proposed rules, the EC notes, “will oblige providers to detect, report and remove CSAM on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.”

The proposal creates a new continental bureaucracy, the EU Centre on Child Sexual Abuse, granted powers of surveillance to “facilitate the efforts of service providers by acting as a hub of expertise, providing reliable information on identified material, receiving and analyzing reports from providers to identify erroneous reports and prevent them from reaching law enforcement, swiftly forwarding relevant reports for law enforcement action and by providing support to victims.”

A New Euro Bureaucracy, With Online Surveillance Powers

This new EU Centre bureaucracy will be the clearing house for what the proposal calls “detection orders.”

These “detection orders,” the EC explained, “will be issued by courts or independent national authorities. To minimize the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and Europol. Both providers and users will have the right to challenge any measure affecting them in Court.”

The proposal mandates an intricate system for content review and moderation:

  • Providers of hosting or interpersonal communication services “will have to assess the risk that their services are misused to disseminate child sexual abuse material or for the solicitation of children, known as grooming. Providers will also have to propose risk mitigation measures.”
  • Then, each country in the EU “will need to designate national authorities in charge of reviewing the risk assessment. Where such authorities determine that a significant risk remains, they can ask a court or an independent national authority to issue a detection order for known or new CSAM or grooming. Detection orders are limited in time, targeting a specific type of content on a specific service.”
  • When a website or platform receives one of these detection orders, they “will only be able to detect content using indicators of CSAM verified and provided by the EU Centre. Detection technologies must only be used for the purpose of detecting CSAM. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.”
  • Providers that have detected CSAM online “will have to report it to the EU Centre.”
  • The new laws should enable national authorities to “issue removal orders if the CSAM is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions.”
  • The new laws will also mandate that app stores “ensure that children cannot download apps that may expose them to a high risk of solicitation of children.”
  • The transnational, politically appointed EU Centre will monitor online service providers, and determine if they are indeed “complying with their new obligations to carry out risk assessments, detect, report, remove and disable access to CSAM online, by providing indicators to detect CSAM and receiving the reports from the providers.”
  • The EU Centre will have vague competencies, interacting with “national law enforcement and Europol, by reviewing the reports from the providers to ensure that they are not submitted in error, and channelling them quickly to law enforcement.”

The document concludes by claiming that it is part of a new European strategy “for a better internet for kids.”

'A Worrying Day' for EU Privacy

The advocacy group European Digital Rights released a statement yesterday lambasting the European Commission’s online CSAM proposal for “failing to find right solutions to tackle child sexual abuse.”

“Today is a worrying day for every person in the EU who wants to send a message privately without exposing their personal information, like chats and photos, to private companies and governments,” EDRi stated, adding that the European Commission’s proposal includes “measures which put the vital integrity of secure communications at risk.”

The group criticized Commissioner Ylva Johansson, who has been behind the continental push for increased surveillance, for spearheading a proposal “which could still force companies to turn our digital devices into potential pieces of spyware, opening the door for a vast range of authoritarian surveillance tactics.”

The proposal, the statement continued, puts “journalists, whistleblowers, civil rights defenders, lawyers, doctors and others who need to maintain the confidentiality of their communications at risk.”

Tech news site Techdirt has called the EC proposal “Europe’s own version of the EARN IT Act,” likening it to the controversial bipartisan U.S. bill that would limit Section 230 protections and establish a new state bureaucracy to monitor and surveil internet content, purportedly to battle “online harms.”

Copyright © 2024 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
