FSC Europe Starts Petition to Protect Rights of Sexuality Professionals

BERLIN — FSC Europe, the recently founded European chapter of industry trade group Free Speech Coalition (FSC), has released a petition and a statement calling on the European Commission to “protect the rights of sexuality professionals, artists and educators” while drafting the E.U.’s new Digital Services Act.

The FSC Europe statement follows:

As the European Union prepares to implement a new set of rules to govern digital services platforms and protect users’ online rights, the Free Speech Coalition is petitioning for sex workers and sexuality professionals, artists and educators to no longer be excluded or discriminated against. The organization proposes 10 steps toward a safer digital space that protects the rights of those working within the sphere of sexuality, already some of society’s most marginalized people, and demands transparency from the platforms, which, as the European Commission itself states, "have become integral parts of our daily lives, economies, societies and democracies."

Sexual Expression is Being Banned Online

Sex in almost all its guises is being repressed in the public online sphere and on social media like never before. Accounts focused on sexuality, from sexuality professionals, adult performers and sex workers to artists, activists, LGBTQI people, publications and organizations, are being deleted without warning or explanation by private companies that are currently able to enforce discriminatory changes to their terms and conditions with no accountability to those affected. Additionally, in many cases it is impossible for users to have their accounts reinstated, even though those accounts are often vital to their ability to generate income, network, organize and share information.

Unpacking the Digital Services Act (DSA)

At the same time as sexual expression is being erased from digital spaces, new legislation is being passed in the European Union to safeguard internet users’ online rights. The European Commission’s Digital Services Act and Digital Markets Act upgrade the rules governing digital services, with a focus, in part, on building a safer and more open digital space. These rules will apply to online intermediary services used by millions every day, including major platforms such as Facebook, Instagram and Twitter. Among other things, they call for greater transparency from platforms, better-protected consumers and empowered users.

With the DSA promising to “shape Europe’s digital future” and “to create a safer digital space in which the fundamental rights of all users of digital services are protected”, it’s time to demand that it’s a future that includes those working, creating, organizing and educating in the realm of sexuality. As we consider what a safer digital space can and should look like, it’s also time to challenge the pervasive and frankly puritanical notion that sexuality — a normal and healthy part of our lives — is somehow harmful, shameful or hateful.

How the DSA Can Get It Right

The DSA is advocating for “effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions." In addition to this, the Free Speech Coalition Europe demands the following:

  • Platforms need to put in place anti-discrimination policies and train their content moderators so as to avoid discrimination on the basis of gender, sexual orientation, race or profession; the same community guidelines need to apply as much to an A-list celebrity or mainstream media outlet as they do to a stripper or queer collective;
  • Platforms must give the user a reason when a post is deleted or an account is restricted or deleted. Shadowbanning is an underhanded means of suppressing users’ voices; users should have the right to be informed when they are shadowbanned and to challenge the decision;
  • Platforms must allow users to request a review of a content moderation decision, and must ensure that moderation takes place in the user’s own jurisdiction rather than in arbitrary jurisdictions that may have different laws or customs; e.g., a user in Germany cannot be banned on the strength of reports and moderation from the Middle East, but must be reviewed by the European moderation team;
  • Decisions on notices of reported content as specified in Article 14 of the DSA should not be handled by automated software, which has proven to delete content indiscriminately. A human should make the final judgment;
  • A notice of content as described in Article 14.2 of the DSA should not immediately make a platform liable for that content as stated in Article 14.3, since such liability would entice platforms to delete reported content indiscriminately in order to avoid it, enabling organized hate groups to mass-report users and have them taken down;
  • Platforms must provide for a department (or, at the very least, a dedicated contact person) within the company for complaints regarding discrimination or censorship;
  • Platforms must provide a means for users to indicate whether they are over the age of 18, as well as a means for adults to hide their profiles and content from children (e.g. marking profiles as 18+); platforms must also give the option to mark certain content as “sensitive”;
  • Platforms must not reduce the features available to those who mark themselves as adult or adult-oriented (i.e., those who have marked their profiles as 18+ or their content as “sensitive”). These profiles should appear as 18+ or “sensitive” when accessed without logging in or without a set age, but should not be excluded from search results or appear as nonexistent;
  • Platforms must set clear, consistent and transparent guidelines about what content is acceptable; however, these guidelines cannot outright ban users focused on adult themes. E.g., a platform could ban highly explicit pornography (such as sexual intercourse videos that show penetration), while still allowing an edited video that doesn’t show penetration;
  • Platforms cannot outright ban content intended for adult audiences, unless a platform is specifically for children, or >50% of their active users are children.

The petition went live today and can be found in full here.

Copyright © 2024 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
