Instagram Addresses Confusing New Content Moderation Policy

MENLO PARK, Calif. — Instagram published a blog post today announcing some changes in the app's moderation policy and appeals process.

The announcement outlined changes that representatives of the Facebook-owned company had first mentioned last month at the unprecedented meeting between the Instagram public policy team and a delegation from the Adult Performers Actors Guild (APAG) at Facebook's monumental Bay Area headquarters.

XBIZ attended the meeting and published an exclusive report the same day, chronicling the conversation between the sex workers’ advocacy group and several top-level Facebook/Instagram execs responsible for deciding which content is allowed on their platforms.

Today’s blog post described the new policy in the typically vague, imperious language common among the leadership of Facebook and other social media giants. With a combination of august arrogance, Sesame-Street-like language that is meant to appear friendly, harmless and inclusive, and a tone of finality and knowing-best, the statement uncannily mimics the public speech patterns of company figurehead Mark Zuckerberg.

“Today, we are announcing a change to our account disable [sic] policy,” reads the anonymous blog post, which an Instagram spokesperson confirmed as “official” and “quotable.” “Together with Facebook, we develop policies to ensure Instagram is a supportive place for everyone. These changes will help us quickly detect and remove accounts that repeatedly violate our policies.”

“Under our existing policy,” the statement continues, “we disable accounts that have a certain percentage of violating content. We are now rolling out a new policy where, in addition to removing accounts with a certain percentage of violating content, we will also remove accounts with a certain number of violations within a window of time.”

Besides this extremely vague reframing of Instagram's still-secret formula for content censorship, the company also introduced “a new notification process to help people understand if their account is at risk of being disabled.”

“This notification will also offer the opportunity to appeal content deleted,” Instagram stated.

These appeals “will be available for content deleted for violations of our nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies, but we’ll be expanding appeals in the coming months. If content is found to be removed in error, we will restore the post and remove the violation from the account’s record.”

"We’ve always given people the option to appeal disabled accounts through our Help Center, and in the next few months, we’ll bring this experience directly within Instagram," the short blog post concluded.

Sex Workers' Concerns Remain

XBIZ contacted an Instagram spokesperson with questions about the vexing new blog post.

The company says that the goal is to make Instagram “a supportive place for everyone,” so we asked the rep whether legal sex workers and producers of adult content are included in “everyone.”

“This includes everyone who uses Instagram, irrespective of their profession,” the rep answered.

What is the rationale for equating “nudity and pornography” with “bullying and harassment, hate speech, drug sales” and “terrorism”? Is Instagram implying that nudity (specified as female nipples and male and female genitalia and buttocks) and pornography (an often derogatory, stigmatizing term for adult content) are equivalent to all those other types of harmful content? (“Drug sales” is also problematic: how does Facebook define “drugs”?)

“No,” the rep clarified. “We have a set of rules that govern what you can or can’t post on Instagram — that’s not to say we equate the severity of one rule with another. Simply, this is a list of things we don’t allow. We do not allow the sales of any drug on Instagram, including illicit and pharmaceutical drugs.”

The blog post states that Instagram is “now rolling out a new policy where, in addition to removing accounts with a certain percentage of violating content, we will also remove accounts with a certain number of violations within a window of time.” Does Instagram intend to keep the “certain percentage,” “certain number” and “window of time” figures a secret, or will it announce what these numbers actually are?

“We don’t share these numbers as it allows bad actors to game our preventative measures,” said the rep.

What does “we’ll bring [the appeals] experience directly within Instagram” even mean?

“Currently people need to go to our Help Centre (a website) to appeal accounts that are removed from Instagram,” clarified the rep. “With this upcoming update, people can appeal directly in the app. In other words, this process will be a lot simpler for our community.”

Finally, we asked whether Instagram is committed to keeping the platform “a safe and supportive place” for adults and sex workers.

“See above,” was the Instagram spokesperson’s succinct answer.

For more background on Instagram and the ongoing War on Porn, click here for the XBIZ Explainer and here for an account of the historic meeting between APAG leadership and the Facebook/Instagram public policy team.
