Google's New Explicit Image Takedown Policy Unlikely to Affect Commercial Images

MOUNTAIN VIEW, Calif. — Google’s updated policies allowing individuals to remove “personal, explicit images” from Google Search results will not affect most commercial images created by a third party with appropriate contracts and releases.

As XBIZ reported, the policy, unveiled last week, was designed to target nonconsensual explicit imagery and to enable individuals “to remove from Search any of their personal, explicit images that they no longer wish to be visible in Search,” Google VP of Trust Danielle Romain wrote in a company blog post.

At the time, Romain specified that the new policy “doesn’t apply to content you are currently commercializing.”

Still, questions lingered among adult companies and creators about situations involving explicit images of individuals who were under contract and/or had given full releases to third-party content producers, including studios and companies.

A Google rep told XBIZ that under the new takedown policy, individuals “can request the removal of third-party created content that features them, if it has been removed by the original publisher.”

The Google rep directed XBIZ to the full text of the new policy, which states that for the company to consider the content for removal, it must meet the following requirements:

The imagery shows you (or the individual you’re representing) nude, in a sexual act, or an intimate state.

You (or the individual you’re representing) didn’t consent to the imagery or the act and it was made publicly available, or the imagery was made available online without your consent.

You are not currently being paid for this content online or elsewhere.

For unauthorized commercial content that does not meet those requirements, such as pirated material, Google instead recommends requesting removal under the DMCA.

Two Specific Scenarios

According to the policy, if Individual A agrees to perform in an explicit sex scene for Company B and signs a contract, release form and 2257 form, which are in the possession of Company B, but then later changes their mind and wants the content removed from Search, the content can only be removed if Company B has withdrawn it from distribution.

Under the new policy, Google would also not automatically remove content if, for example, Individual A agreed to perform in an explicit sex scene for Company B, but Company B later sold the content and transferred the rights to Company C, which marketed it in a way that Individual A disapproved of, leading Individual A to request its removal from Search.

The performer might have other options, however, particularly if the third-party producer were found to have used predatory means in producing the content featuring the reporting user. A notable example of that scenario is the GirlsDoPorn case.

Another scenario in which a performer could request removal of images from Search is if the third-party producer has relinquished its rights to the content.


Copyright © 2025 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
