Apple Unveils Plan to Scan All iPhones to Determine if Adult Content is Illegal

CUPERTINO, Calif. — Apple briefed academics this week about its plan to install software on iPhones sold in the U.S. to "scan for child abuse imagery."

The initiative was reported today by The Financial Times, which noted that Apple’s plans are “raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.”

Apple’s proposed system, neuralMatch, would “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified.”

The Financial Times added that “the scheme will initially roll out only in the U.S.”

Security researchers told the publication that while they may be “supportive of efforts to combat child abuse,” they are nevertheless “concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.”

'An Absolutely Appalling Idea'

Ross Anderson, professor of security engineering at the University of Cambridge, called neuralMatch “an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of […] our phones and laptops.”

The publication further noted that although the system is currently trained to spot child sexual abuse imagery, “it could be adapted to scan for any other targeted imagery and text, for instance, terror beheadings or anti-government signs at protests,” and that “Apple’s precedent could also increase pressure on other tech companies to use similar techniques.”

Matthew Green, a security professor at Johns Hopkins University, warned about the expansive implications of such a technology.

“This will break the dam — governments will demand it from everyone,” Green noted.

The Financial Times described the intrusive nature of the new technology: “Apple’s neuralMatch algorithm will continuously scan photos that are stored on a U.S. user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as ‘hashing,’ will be compared with those on a database of known images of child sexual abuse.”
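The “hashing” step described above can be sketched in a few lines of code. The example below uses a simple average hash rather than Apple’s proprietary NeuralHash, which has not been published; the function names and the match threshold are invented for illustration.

```python
# Illustrative sketch only: Apple's actual NeuralHash algorithm is proprietary.
# A simple "average hash" shows the general idea of reducing an image to a
# string of bits and comparing it against a database of known hashes.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit perceptual hash."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel >= avg else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two hashes differ."""
    return bin(a ^ b).count("1")

def matches_database(photo_hash: int, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag a photo whose hash is within `threshold` bits of any known hash."""
    return any(hamming_distance(photo_hash, h) <= threshold for h in known_hashes)
```

Unlike an exact cryptographic hash, a perceptual hash of this kind tolerates small edits such as recompression or resizing, which is what makes it useful for matching known images and, as the researchers quoted above warn, equally usable against any other image database.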

How the Algorithm Was Trained

Whether this private system deems an image on a user’s device legal or illegal depends entirely on how Apple has configured the algorithm. In this case, as reported, the system “has been trained on 200,000 sex abuse images collected by the U.S. non-profit National Center for Missing and Exploited Children.”

“According to people briefed on the plans, every photo uploaded to iCloud in the U.S. will be given a ‘safety voucher’ saying whether it is suspect or not,” the report added. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
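The “safety voucher” threshold described in the report can be sketched the same way. This is a plain counter for illustration only; a real deployment would enforce the threshold cryptographically rather than in readable application code, and the names and threshold value below are assumptions.

```python
# Hypothetical sketch of the threshold logic described in the report: each
# uploaded photo carries a "safety voucher," and human review is triggered
# only once the number of suspect vouchers crosses a threshold.
from dataclasses import dataclass

REVIEW_THRESHOLD = 10  # assumed value; the report does not state one

@dataclass
class SafetyVoucher:
    photo_id: str
    suspect: bool  # result of the on-device hash match

def should_escalate(vouchers: list[SafetyVoucher]) -> bool:
    """Return True once enough photos are marked suspect to trigger review."""
    suspect_count = sum(1 for v in vouchers if v.suspect)
    return suspect_count >= REVIEW_THRESHOLD
```

The point of the threshold is that a single match reveals nothing; only a pattern of matches unlocks the suspect photos for human review.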

The report does not specify what safeguards would be in place in case of a mistake or "false positive," when the algorithm identifies a piece of legal content as CSAM and law enforcement is compelled to act — or who would be liable for a life-and-reputation-destroying misidentification.
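To see why false positives worry researchers, a back-of-the-envelope calculation helps; the per-photo error rate and library size below are invented, since Apple has published no figures.

```python
# Rough illustration of how small per-photo error rates compound across a
# large photo library. All numbers here are hypothetical.
def prob_at_least_one_false_match(per_photo_rate: float, num_photos: int) -> float:
    """P(at least one false match) = 1 - (1 - p)^N, assuming independent errors."""
    return 1 - (1 - per_photo_rate) ** num_photos

# Even a one-in-a-million per-photo error rate adds up:
print(prob_at_least_one_false_match(1e-6, 20_000))  # ~0.02, i.e. about a 2% chance
```

Multiplied across tens of millions of users, even tiny error rates would produce a steady stream of wrongly flagged accounts, which is precisely the liability question the report leaves open.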
