X's Porn-Recognition AI Survives Illinois' Biometric Privacy Law Challenge

CHICAGO — A lawsuit filed under Illinois’ unusual state law regulating biometric data collection has taken aim at an AI program used by X since 2015 to identify nudity and NSFW images.

The proposed class action suit, filed by lawyers on behalf of Chicago resident Mark Martell in 2023, alleged that X — the platform formerly known as Twitter — was in violation of the Illinois Biometric Information Privacy Act, commonly known as BIPA.

As XBIZ reported, plaintiff lawyers filing BIPA suits in Illinois have been known to target big tech companies such as Google, OnlyFans, Shutterfly and others. During one such action, Facebook agreed to pay a $650 million settlement over BIPA issues.

Martell’s complaint states that “since approximately 2015, Twitter has implemented software to police pornographic and other not-safe-for-work (‘NSFW’) images uploaded to the site. NSFW images are then ‘tagged’ by Twitter as such, preventing them from being viewed by people who do not wish to view them.”

The complaint then alleges that, in analyzing each uploaded image to determine whether “it contains nudity (or any other qualities that Twitter deems objectionable),” the platform “actively collects, captures and/or otherwise obtains; stores; and/or makes use of the biometric identifiers and biometric information of any individual included in each photo.”

On Thursday, U.S. District Court Judge Sunil R. Harjani threw out Martell’s lawsuit, but left the door open for the plaintiff to file an amended complaint by June 27.

At the core of Martell’s lawsuit is X’s use of the third-party software PhotoDNA, developed by Microsoft. The lawsuit also points to a 2015 Wired magazine article about an in-house AI technology that Twitter developed in 2014 after acquiring Madbits, a pioneering AI startup founded by NYU researcher Clément Farabet.

“When Farabet and his MadBits crew joined Twitter last summer, Alex Roetter — the company’s head of engineering — told them to build a system that could automatically identify NSFW images on its popular social network,” Wired reported at the time. “A year later, that AI is in place.”

Martell alleges that X’s AI scans images “without first making certain disclosures and securing written, informed permission from Illinois users posting photos” as required under Illinois’ BIPA, Law360 reported.

Harjani ruled, however, that PhotoDNA’s creation of a “hash,” or unique digital signature, of one of Martell’s images does not amount to “a scan of his facial geometry in violation of BIPA,” the report added.

The judge defined “biometric identifier” for BIPA purposes as “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry,” thus excluding photos.

“While plaintiff alleged that PhotoDNA scanned the photo to create a unique hash, plaintiff did not allege facts indicating that the hash is a scan of face geometry, as opposed to merely a record of the photo,” Harjani wrote. “Plaintiff’s allegations leave open the question of whether the hash is a unique representation of the entire photo or specific to the faces of the people in the picture.”
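Neither the complaint nor the ruling spells out what a PhotoDNA hash actually contains, but the distinction the judge draws, a signature of the whole photo versus measurements of a face, can be illustrated with a generic perceptual hash. The sketch below is purely illustrative: PhotoDNA itself is proprietary, so the open-source imagehash library (with Pillow) stands in here, and the file name "upload.jpg" is a placeholder.

```python
# Illustrative sketch only. PhotoDNA is proprietary; the open-source "imagehash"
# library stands in here to show what a whole-image signature is: a value derived
# from the picture's overall pixel content, not from landmarks or measurements of
# any face that happens to appear in it.
from PIL import Image
import imagehash


def whole_image_signature(path: str) -> str:
    """Return a perceptual hash of the entire image as a hex string."""
    img = Image.open(path)
    return str(imagehash.phash(img))


if __name__ == "__main__":
    # "upload.jpg" is a placeholder file name. Near-identical copies of the same
    # picture yield the same or very similar hashes, which is how matching
    # against a database of known images works.
    print(whole_image_signature("upload.jpg"))
```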

Industry attorney Corey Silverstein of Silverstein Legal told XBIZ, “I absolutely love this decision. Judge Harjani did an incredible job analyzing a very tricky issue. BIPA has turned into a goldmine for class action plaintiffs and their attorneys, and in my opinion the abuse of these types of claims has gotten completely out of hand. I hope that this will lead prospective plaintiffs and their counsel to pause before filing baseless lawsuits.

“The plaintiff’s main contention is that making hash values of photos violates BIPA,” Silverstein explained. “The plaintiff’s argument seems to be that services using PhotoDNA must hash all photos, including images showing people’s faces in their database, to check if those photos match a hash value in the PhotoDNA database. The plaintiff seemingly argues that hashing the photo necessarily calculates the photo subjects’ face geometries. The court wasn’t buying what the plaintiff was selling, although the judge did grant leave to amend, meaning the plaintiff could amend its allegations and take another crack at its claim.”
