The Free Speech Coalition Europe is a group representing the adult industry. It has organised a petition to the Members of the European Parliament on the IMCO, JURI and LIBE Committees on the subject of how new EU internet censorship laws will impact sex workers. The petition reads:
10 Steps to a Safer Digital Space that Protects the Rights of Sexuality Professionals, Artists and Educators
"Online platforms have become integral parts of our daily
lives, economies, societies and democracies."
Not our words but those of the European Commission. And after more than a year in the grip of a global pandemic, this statement rings truer than ever before. So why are some of society's most marginalised people being excluded from these necessary spaces?
Sexual Expression is Being Banned Online
Sex in almost all its guises is being repressed in the public online
sphere and on social media like never before. Accounts focused on sexuality -- from sexuality professionals, adult performers and sex workers to artists, activists, LGBTIQ folks, publications and organisations -- are being deleted without warning or explanation by private companies that are currently able, with little regulation, to enforce discriminatory changes to their terms and conditions without accountability to those affected. Moreover, in many cases it is impossible for users to have their accounts reinstated -- accounts that are often vitally linked to their ability to generate income, network, organise and share information.
Digital Services Act (DSA)
At the same time as sexual expression is being erased from digital spaces, new legislation is being passed in the European Union to safeguard internet users' online rights. The European Commission's Digital Services Act and Digital Markets Act encompass upgraded rules governing digital services, focused in part on building a safer and more open digital space. These rules will apply to online intermediary services used by millions every day, including major platforms such as Facebook, Instagram and Twitter. Amongst other things, they advocate for greater transparency from platforms, better-protected consumers and empowered users.
With the DSA promising to "shape Europe's digital future" and "to create a safer digital space in which the fundamental rights of all users of digital services are protected", it's time to demand that it's a
future that includes those working, creating, organising and educating in the realm of sexuality. As we consider what a safer digital space can and should look like, it's also time to challenge the pervasive and frankly puritanical notion that sexuality
-- a normal and healthy part of our lives -- is somehow harmful, shameful or hateful.
How the DSA Can Get It Right
The DSA is advocating for "effective safeguards for users, including the
possibility to challenge platforms' content moderation decisions". In addition to this, the Free Speech Coalition Europe demands the following:
Platforms need to put in place anti-discrimination policies and train their content moderators so as to avoid discrimination on the basis of gender, sexual orientation, race, or profession -- the same community guidelines need to
apply as much to an A-list celebrity or mainstream media outlet as they do to a stripper or queer collective;
Platforms must provide the user with the reason when a post is deleted or an account is restricted or deleted;
Shadowbanning is an underhanded means of suppressing users' voices. Users should have the right to be informed when they are shadowbanned and to challenge the decision;
Platforms must allow users to request a review of a content moderation decision, and must ensure moderation takes place in the user's own jurisdiction rather than an arbitrary one that may have different laws or customs; e.g., a user in Germany cannot be banned on the basis of reports and moderation in the Middle East -- the case must be reviewed by the European moderation team;
Decision-making on notices of reported content as specified in Article 14 of the DSA should not be handled by automated software, as these
have proven to delete content indiscriminately. A human should make the final judgement;
The notice of content as described in Article 14.2 of the DSA should not immediately render a platform liable for that content as stated in Article 14.3, since such liability would entice platforms to delete reported content indiscriminately to avoid it, enabling organised hate groups to mass-report and take down users;
Platforms must establish a department (or, at the very least, a dedicated contact person) within the company for complaints regarding discrimination or censorship;
Platforms must provide a means for users to indicate whether they are over the age of 18, as well as a means for adults to hide their profiles and content from children (e.g. marking profiles as 18+);
Platforms must give users the option to mark certain content as "sensitive";
Platforms must not reduce the features available to those who mark themselves as adult or adult-oriented (i.e. those who have marked their profiles as 18+ or their content as "sensitive"). These profiles should appear as 18+ or "sensitive" when accessed without a login or a set age, but should not be excluded from search results or appear as non-existent;
Platforms must set clear, consistent and transparent guidelines about what content is acceptable; however, these guidelines cannot outright ban users focused on adult themes. E.g., a platform could ban highly explicit pornography (such as videos of sexual intercourse that show penetration) while still allowing an edited video that does not show penetration;
Platforms cannot outright ban content intended for adult audiences, unless the platform is specifically for children or more than 50% of its active users are children.