Google's New Explicit Image Takedown Policy Unlikely to Affect Commercial Images

MOUNTAIN VIEW, Calif. — Google’s updated policies allowing individuals to remove “personal, explicit images” from Google Search results will not affect most commercial images created by a third party with appropriate contracts and releases.

As XBIZ reported, the policy, which was unveiled last week, was designed to target nonconsensual explicit imagery and to enable individuals “to remove from Search any of their personal, explicit images that they no longer wish to be visible in Search,” Google VP for Trust Danielle Romain wrote in a company blog post.

At the time, Romain specified that the new policy “doesn’t apply to content you are currently commercializing.”

Still, questions lingered among adult companies and creators about situations involving explicit images of individuals who were under contract and/or had given full releases to third-party content producers, including studios and companies.

A Google rep told XBIZ that under the new takedown policy, individuals “can request the removal of third-party created content that features them, if it has been removed by the original publisher.”

The Google rep directed XBIZ to the full text of the new policy, which states that for the company to consider the content for removal, it must meet the following requirements:

- The imagery shows you (or the individual you’re representing) nude, in a sexual act, or an intimate state.

- You (or the individual you’re representing) didn’t consent to the imagery or the act and it was made publicly available, or the imagery was made available online without your consent.

- You are not currently being paid for this content online or elsewhere.

For unauthorized commercial content that does not meet those requirements, such as pirated material, Google instead recommends requesting removal under the DMCA.
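Read as a checklist, those criteria amount to a simple eligibility test. The Python sketch below is only an illustrative, plain-language reading of the quoted requirements and of Google's statement about publisher-withdrawn third-party content; it is not Google's actual review logic or any public API, and the RemovalRequest fields and eligible_for_policy_removal function are hypothetical names invented for this example.

```python
from dataclasses import dataclass


@dataclass
class RemovalRequest:
    depicts_requester_explicitly: bool  # nude, in a sexual act, or an intimate state
    consented_to_publication: bool      # consented to the imagery/act being made public
    currently_commercialized: bool      # requester is currently being paid for the content
    publisher_withdrew_content: bool    # original third-party publisher no longer distributes it


def eligible_for_policy_removal(req: RemovalRequest) -> bool:
    """Rough reading of the quoted criteria: the imagery must show the requester
    explicitly, the requester must not be commercializing it, and either the
    imagery was published without consent or the original publisher has since
    withdrawn it from distribution."""
    if not req.depicts_requester_explicitly or req.currently_commercialized:
        return False
    return (not req.consented_to_publication) or req.publisher_withdrew_content


# Scenario 1 below: performer signed releases and Company B still distributes the scene.
print(eligible_for_policy_removal(
    RemovalRequest(True, True, False, False)))  # False -- not eligible under this policy

# Later scenario: the producer has relinquished rights and withdrawn the content.
print(eligible_for_policy_removal(
    RemovalRequest(True, True, False, True)))   # True -- removal can be requested
```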

Two Specific Scenarios

According to the policy, if Individual A agrees to perform in an explicit sex scene for Company B and signs a contract, release form and 2257 form that remain in Company B’s possession, but later changes their mind and wants the content removed from Search, the content can be removed only if Company B has withdrawn it from distribution.

Under the new policy, Google would also not automatically remove content if, for example, Individual A agreed to perform in an explicit sex scene for Company B, but Company B later sold the content and transferred the rights to Company C, which marketed it in a way that Individual A disapproved of, leading Individual A to request its removal from Search.

The performer might have other options, however, particularly if the third-party publisher were found to have used predatory means in producing the content featuring the person requesting removal. The GirlsDoPorn case is a notable example of that scenario.

Another scenario in which the performer could request removal of images from Search is if the third-party producer has relinquished its rights to the content.

