SAN FRANCISCO — San Francisco City Attorney David Chiu is suing 16 companies based in the U.S. and abroad, alleging they create and distribute nonconsensual AI-generated pornography.
In a statement released Thursday, Chiu’s office called the lawsuit, filed on behalf of the state of California, “a first-of-its-kind.” The suit alleges violations of state and federal laws prohibiting deepfake pornography, revenge pornography and child sexual abuse material (CSAM), as well as violations of California’s Unfair Competition Law.
The suit, the statement notes, seeks “the removal of Defendants’ websites as well as injunctive relief to permanently restrain Defendants from engaging in this unlawful conduct. The lawsuit also seeks civil penalties and costs for bringing the lawsuit.”
The statement alleges that “the proliferation of nonconsensual deepfake pornographic images has exploited real women and girls across the globe,” prompting the lawsuit against what the City Attorney calls “the owners of 16 of the most-visited websites that invite users to create nonconsensual nude images of women and girls.”
“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” said Chiu. “Generative AI has enormous promise, but as with all new technologies, there are unintended consequences and criminals seeking to exploit the new technology. We have to be very clear that this is not innovation — this is sexual abuse. This is a big, multi-faceted problem that we, as a society, need to solve as soon as possible.”
The City Attorney’s Office alleges that “bad actors” behind these deepfake sites have “impacted women and girls in California and beyond, causing immeasurable harm to everyone from Taylor Swift and Hollywood celebrities to high school and middle school students.”
Deepfakes, according to the statement, “are used to extort, bully, threaten, and humiliate women and girls.”
The 16 targeted websites, the statement alleges, “offer to ‘undress’ images of women and girls. These websites offer user-friendly interfaces for uploading clothed images of real people to generate realistic pornographic versions of those images. These websites require users to subscribe or pay to generate nude images, profiting off of nonconsensual pornographic images of children and adults. Collectively, these websites have been visited over 200 million times just in the first six months of 2024.”
Chiu’s statement does not mention nonconsensual, AI-generated pornography targeting men.
A Tangled Web of Obscure Sites and Companies
Chiu’s office attached the complaint, filed in San Francisco Superior Court as “People of the State of California v. Sol Ecom, Inc., et al.,” but redacted the names of the websites. The unredacted version of the complaint, however, names the targeted websites as: Drawnudes.io (operated by Sol Ecom); Porngen.art and Undresser.ai (operated by Briver); Undress.app, Undress.love, Undress.cc, and Ai-nudes.app (operated by Itai Tech); Nudify.online (operated by Defirex); Undressing.io (operated by Itai OÜ); Undressai.com (operated by Gribinets); Deep-nude.ai (operated by someone identified only as Doe #1); Pornx.ai (operated by someone identified only as Doe #2); Deepnude.cc (operated by someone identified only as Doe #3); Ainude.ai (operated by someone identified only as Doe #4); and Clothoff.io (operated by someone identified only as Doe #5).
According to the lawsuit, several of these sites openly promote the nonconsensual nature of their deepfakes.
Drawnudes.io, for example, allegedly enables users to “deepnude girls for free” by using the website’s AI technology to ‘undress’ uploaded images.
“Users are invited to upload a photo with the message: ‘Have someone to undress?’” the complaint reads. “Sol Ecom provides step-by-step instructions on how to select images that will provide ‘good’ quality nudified results.”
Some of the named companies and sites were among those first investigated by reporter and researcher Kolina Koltai in a February exposé for the investigative journalism site Bellingcat.
In the article, Koltai claimed to have identified “a loosely affiliated network of similar platforms” including Clothoff, Nudify, Undress and DrawNudes, which had “variously manipulated financial and online service providers that ban adult content and non-consensual deepfakes by disguising their activities to evade crackdowns.”
Koltai reported that DrawNudes initially appeared connected to an obscure firm called GG Technology LTD, but during her investigation DrawNudes “changed the company listed on its website to Sol Ecom Inc.,” the first defendant named in Chiu’s lawsuit.
Sol Ecom, Koltai reported, has listed addresses in Miami and Los Angeles. Both GG Technology and Sol Ecom list Ukrainian nationals as operators.