Apple Unveils Plan to Scan All iPhones to Determine if Adult Content is Illegal

CUPERTINO, Calif. — Apple briefed academics this week about its plans to install software on iPhones sold in the U.S. to "scan for child abuse imagery."

The initiative was reported today by The Financial Times, which noted that Apple’s plans are “raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.”

Apple’s proposed system, neuralMatch, would “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified.”

The Financial Times added that “the scheme will initially roll out only in the U.S.”

Security researchers told the publication that while they may be “supportive of efforts to combat child abuse,” they are nevertheless “concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.”

'An Absolutely Appalling Idea'

Ross Anderson, professor of security engineering at the University of Cambridge, called neuralMatch “an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of […] our phones and laptops.”

The report also notes that “although the system is currently trained to spot child sex abuse, it could be adapted to scan for any other targeted imagery and text, for instance, terror beheadings or anti-government signs at protests, say researchers. Apple’s precedent could also increase pressure on other tech companies to use similar techniques.”

Matthew Green, a security professor at Johns Hopkins University, warned about the expansive implications of such a technology.

“This will break the dam — governments will demand it from everyone,” Green noted.

The Financial Times described the intrusive nature of the new technology: “Apple’s neuralMatch algorithm will continuously scan photos that are stored on a U.S. user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as ‘hashing,’ will be compared with those on a database of known images of child sexual abuse.”
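
To make that mechanics concrete, here is a minimal, hypothetical sketch of hash-based matching in Python. It uses a cryptographic hash (SHA-256) purely as a stand-in; the system described in the report would rely on a perceptual hash, so that resized or re-compressed copies of a known image still match, which a cryptographic hash cannot do. The database contents, function names and matching logic below are illustrative assumptions, not Apple's implementation.

```python
import hashlib

# Hypothetical stand-in for the database of hashes of known child sexual
# abuse images (in the reported design, derived from NCMEC's collection).
KNOWN_IMAGE_HASHES: set[str] = set()


def hash_photo(photo_bytes: bytes) -> str:
    """Convert a photo into a fixed-length 'string of numbers' (a hash).

    SHA-256 is used here only for illustration: it matches exact
    byte-for-byte copies. The system described in the report uses a
    perceptual hash, designed to match visually identical images even
    after resizing or re-compression.
    """
    return hashlib.sha256(photo_bytes).hexdigest()


def is_suspect(photo_bytes: bytes, known_hashes: set[str] = KNOWN_IMAGE_HASHES) -> bool:
    """Return True if the photo's hash appears in the known-image database."""
    return hash_photo(photo_bytes) in known_hashes
```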

How the Algorithm Was Trained

Whether this proprietary system deems an image on a user’s device legal or illegal will depend on how Apple has set up the algorithm. In this case, as reported, the system “has been trained on 200,000 sex abuse images collected by the U.S. non-profit National Center for Missing and Exploited Children.”

“According to people briefed on the plans, every photo uploaded to iCloud in the U.S. will be given a ‘safety voucher’ saying whether it is suspect or not,” the report added. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
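
The “safety voucher” and threshold step can likewise be sketched. The snippet below is an illustrative assumption of how such an accounting might work: each uploaded photo carries a voucher recording whether it matched the database, and nothing is escalated for human review until the count of suspect vouchers crosses a threshold. The report does not say what that threshold is; the number used here is an arbitrary placeholder.

```python
from dataclasses import dataclass

# The report says only that "a certain number" of suspect photos triggers
# review; this threshold is an arbitrary placeholder, not Apple's figure.
MATCH_THRESHOLD = 10


@dataclass
class SafetyVoucher:
    photo_id: str
    suspect: bool  # whether the photo's hash matched the known-image database


def should_escalate(vouchers: list[SafetyVoucher], threshold: int = MATCH_THRESHOLD) -> bool:
    """Return True once enough photos are marked suspect.

    Only at that point, per the report, would the flagged photos be decrypted
    for human review and, if apparently illegal, referred to authorities.
    """
    return sum(v.suspect for v in vouchers) >= threshold
```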

The report does not specify what safeguards would be in place in case of a mistake or "false positive," when the algorithm identifies a piece of legal content as CSAM and law enforcement is compelled to act — or who would be liable for a life-and-reputation-destroying misidentification.
