Google's New Explicit Image Takedown Policy Unlikely to Affect Commercial Images

MOUNTAIN VIEW, Calif. — Google’s updated policies allowing individuals to remove “personal, explicit images” from Google Search results will not affect most commercial images created by a third party with appropriate contracts and releases.

As XBIZ reported, the policy, unveiled last week, was designed to target nonconsensual explicit imagery and to enable individuals "to remove from Search any of their personal, explicit images that they no longer wish to be visible in Search," Google VP for Trust Danielle Romain wrote in a company blog post.

At the time, Romain specified that the new policy “doesn’t apply to content you are currently commercializing.”

Still, questions lingered among adult companies and creators about situations involving explicit images of individuals who were under contract and/or had given full releases to third-party content producers, including studios and companies.

A Google rep told XBIZ that under the new takedown policy, individuals “can request the removal of third-party created content that features them, if it has been removed by the original publisher.”

The Google rep directed XBIZ to the full text of the new policy, which states that for the company to consider the content for removal, it must meet the following requirements:

The imagery shows you (or the individual you’re representing) nude, in a sexual act, or an intimate state.

You (or the individual you’re representing) didn’t consent to the imagery or the act and it was made publicly available, or the imagery was made available online without your consent.

You are not currently being paid for this content online or elsewhere.

For unauthorized commercial content that does not meet those requirements, such as pirated material, Google instead recommends requesting removal under the DMCA.

Two Specific Scenarios

According to the policy, suppose Individual A agrees to perform in an explicit sex scene for Company B and signs a contract, a release form and a 2257 form, all of which remain in Company B's possession, but later changes their mind and wants the content removed from Search. In that case, the content can be removed only if Company B has withdrawn it from distribution.

Nor would Google automatically remove content under the new policy if, for example, Individual A agreed to perform in an explicit sex scene for Company B, but Company B later sold the content and transferred the rights to Company C, which then marketed it in a way Individual A disapproved of, prompting Individual A to request its removal from Search.

The performer might have other options, however, particularly if the third-party publisher were found to have used predatory means in producing the content featuring the reporting user. A notable example of that scenario is the GirlsDoPorn case.

Another scenario in which the performer could request removal of images from Search is if the third-party producer relinquished its rights to the content.

Copyright © 2026 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
