Oversight Board Questions Instagram's Nudity, 'Sexual Solicitation' Standards

LONDON — Meta’s Oversight Board — a panel of experts selected by the company to deliberate on content decisions — released a decision this week recommending that the company clarify its arbitrary and vague definitions of nudity, sexual activity and sexual solicitation.

The London-based Oversight Board issued the recommendations in the course of overturning Meta’s decisions to remove two Instagram posts in 2021 and 2022 depicting “transgender and non-binary people with bare chests.”

Both images were posted by the same account, which is maintained by “a US-based couple who identify as transgender and non-binary,” according to the document’s case summary.

According to the Oversight Board’s description, “Both posts feature images of the couple bare-chested with the nipples covered. The image captions discuss transgender healthcare and say that one member of the couple will soon undergo top surgery (gender-affirming surgery to create a flatter chest), which the couple are fundraising to pay for.”

After the posts were flagged by Meta’s automated systems and reported by users, Meta’s Community Standards moderators “removed both posts for violating the Sexual Solicitation Community Standard, seemingly because they contain breasts and a link to a fundraising page.”

Oversight Board Critical of Meta's Moderation Process

The document by the Oversight Board contains several passages criticizing Meta’s handling of potentially sexual content uploaded by users.

Meta’s internal guidance to moderators on when to remove content under the Sexual Solicitation policy, the Board opined, “is far broader than the stated rationale for the policy, or the publicly available guidance. This creates confusion for users and moderators and, as Meta has recognized, leads to content being wrongly removed.”

Meta’s Adult Nudity and Sexual Activity Community Standard — prohibiting “images containing female nipples other than in specified circumstances, such as breastfeeding and gender confirmation surgery” — is based on “a binary view of gender and a distinction between male and female bodies,” the Board wrote. “Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale.”

Meta’s restrictions and exceptions to the rules on female nipples, the document continued, “are extensive and confusing, particularly as they apply to transgender and non-binary people. Exceptions to the policy range from protests, to scenes of childbirth, and medical and health contexts, including top surgery and breast cancer awareness. These exceptions are often convoluted and poorly defined. In some contexts, for example, moderators must assess the extent and nature of visible scarring to determine whether certain exceptions apply. The lack of clarity inherent in this policy creates uncertainty for users and reviewers, and makes it unworkable in practice.”

The Board also found that Meta’s policies on adult nudity “result in greater barriers to expression for women, trans, and gender non-binary people on its platforms. For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected, as these cases show.”

The Board recommended that Meta “should seek to develop and implement policies that address all these concerns. It should change its approach to managing nudity on its platforms by defining clear criteria to govern the Adult Nudity and Sexual Activity policy, which ensure all users are treated in a manner consistent with human rights standards. It should also examine whether the Adult Nudity and Sexual Activity policy protects against non-consensual image sharing, and whether other policies need to be strengthened in this regard.”

A Poorly Defined Policy Against ‘Sexual Solicitation’

The Oversight Board also recommended that Meta “clarify its public-facing Sexual Solicitation policy and narrow its internal enforcement guidance to better target such violations.”

Instagram’s Community Guidelines state that “offering sexual services” is not allowed. This provision, the Board noted, “then links to Facebook’s Community Standard on Sexual Solicitation.”

Facebook’s policy reads, “We draw the line, however, when content facilitates, encourages or coordinates sexual encounters or commercial sexual services between adults. We do this to avoid facilitating transactions that may involve trafficking, coercion and non-consensual sexual acts. We also restrict sexually explicit language that may lead to sexual solicitation because some audiences within our global community may be sensitive to this type of content, and it may impede the ability for people to connect with their friends and the broader community.”

The list of contact information that triggers removal as an implicit offer of sexual solicitation, the Board noted, includes social media profile links and “links to subscription-based websites (for example, OnlyFans.com or Patreon.com).”

The Board ultimately found that the Sexual Solicitation Community Standard “contains overbroad criteria in the internal guidelines provided to reviewers. This poorly tailored guidance contributes to over-enforcement by reviewers and confusion for users. Meta acknowledged this, as it explained to the Board that applying its internal guidance could ‘lead to over-enforcement’ in cases where the criteria for implicit sexual solicitation are met but it is clear that there was ‘no intention to solicit sex.’”

Moreover, the fact that reviewers “repeatedly reached different outcomes about this content” suggested to the Board “a lack of clarity for moderators on what content should be considered sexual solicitation.”

Copyright © 2025 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
