Google Researchers Continue Targeting Adult Content for Censorship

MOUNTAIN VIEW, Calif. — In a new paper, Google researchers continue to classify explicit sexual material in the same category as “fake, hateful, or harmful content” that requires filtering.

In the research paper about the company's proprietary AI technology, Google researchers write that although generative AI models, such as the text-to-image system DALL-E 2 whose outputs have gone viral, have made “tremendous progress,” Google has decided not to release its own text-to-video model, called Imagen Video, until “concerns are mitigated” regarding potential misuse, “for example to generate fake, hateful, explicit or harmful content.”

In other words, as tech news site TweakTown, which first flagged the Google paper, editorialized, Google “has subtly said that it won't be releasing its new video-generating artificial intelligence system over it producing gore, porn and racism.”

Google’s researchers and policymakers view depictions of human sexuality as part of the “problematic data” they see as presenting “important safety and ethical challenges.”

The researchers remain hopeful that they can one day develop better tools for censoring sexual content, but concede that, at present, “though our internal testing suggests much of explicit and violent content can be filtered out, there still exists social biases and stereotypes which are challenging to detect and filter.”

The company has therefore decided not to release Imagen Video until it can fully censor “problematic content,” including “explicit material.”

The only paper cited by the researchers that directly concerns explicit content is titled “Multimodal Datasets: Misogyny, Pornography, and Malignant Stereotypes,” which similarly bundles “pornography” into a category of “troublesome” material alongside rape and racist slurs.

Copyright © 2025 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
