Anti-porn campaigners analyse video titles on major porn tubes and with the help of a little stretching of the English language conclude that 1 in 8 are 'sexually violent'
|7th April 2021 |
4th April 2021. See article from bbc.co.uk
See full paper from academic.oup.com
Anti-porn campaigners have been cataloguing porn titles on Pornhub, XVideos and xHamster and claim that one in eight have titles describing sexually violent acts. Their use of the term 'sexually violent' is a little bizarre though, and it has inevitably been redefined to include non-violent material that the authors deem to be violent, totally at odds with normal people's use of the English language.
The campaigners analysed 131,738 titles of videos that appeared on the front page of the tube websites (without specifically searching for anything, nor allowing the site to build up a profile of preferences). They excluded BDSM material, seemingly having got confused about whether the term 'violence' applies to a genre that is perhaps more PC than others. The campaigners claimed that:
- 8,421 (6.4%) titles included terms for family relationships and 5,785 (4.4%) titles described sexual activity between family members - the most common category of 'sexually violent' material identified in the survey
- 5,389 (4.1%) titles referred to physical aggression or the depiction of forced sexual activity (acknowledging that performers had likely consented)
- 2,966 (2.2%) titles described image-based sexual abuse, including hidden cams and upskirting
- 2,698 (2.0%) titles described coercion and exploitation
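The categorisation above is essentially keyword matching against title text. A minimal sketch of that kind of classification might look like the following; note that the category names and keyword lists here are invented purely for illustration, and are not the researchers' actual coding scheme:

```python
# Illustrative sketch of keyword-based title classification, in the spirit of
# the survey's method. The keyword lists are invented; the published study
# used its own, far more detailed, coding scheme.
CATEGORIES = {
    "familial": ["stepmom", "stepsis", "daddy"],
    "aggression": ["rough", "destroyed", "punish"],
    "image_based_abuse": ["hidden cam", "spycam", "upskirt"],
}

def classify_title(title: str) -> list[str]:
    """Return every category whose keyword list matches the title."""
    lowered = title.lower()
    return [
        category
        for category, keywords in CATEGORIES.items()
        if any(keyword in lowered for keyword in keywords)
    ]

def tally(titles: list[str]) -> dict[str, int]:
    """Count how many titles fall into each category."""
    counts = {category: 0 for category in CATEGORIES}
    for title in titles:
        for category in classify_title(title):
            counts[category] += 1
    return counts
```

Note that a title can land in several categories at once, which is one reason headline percentages from such surveys need careful reading.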
Pornhub's owner MindGeek recently removed millions of videos that had been uploaded by unverified users after claims of hosting illegal content. But it commented on the clips it has allowed to remain online:
Consenting adults are entitled to their own sexual
preferences, as long as they are legal and consensual, and all kinks that meet these criteria are welcome on Pornhub.
Fiona Vera-Gray, who co-authored the survey with colleague Clare McGlynn, said:
It's shocking that this
is the material that the porn companies themselves are choosing to showcase to first-time users.
Sexually violent material eroticised non-consent
and distorted the boundary between sexual pleasure and sexual violence.
The survey, titled Sexual violence as a sexual script in mainstream online pornography, is published in the latest issue of The British Journal of Criminology, with its abstract reading:
This article examines the ways in which mainstream pornography positions sexual violence as a normative sexual script by analysing the video titles found on the landing pages of
the three most popular pornography websites in the United Kingdom. The study draws on the largest research sample of online pornographic content to date and is unique in its focus on the content immediately advertised to a new user. We found that one in
eight titles shown to first-time users on the first page of mainstream porn sites describe sexual activity that constitutes sexual violence. Our findings raise serious questions about the extent of criminal material easily and freely available on
mainstream pornography websites and the efficacy of current regulatory mechanisms.
Offsite Comment: Academic Click Bait: The War On Porn Continues
7th April 2021. See article from reprobatepress.com
by David Flint
The study makes big claims that were inevitably picked up and repeated uncritically by media outlets like the BBC. But even a cursory glance at the evidence and the conclusions might make a more open-minded
person raise their eyebrows. If ever there was a study that set out in search of evidence to back up a belief already held, this is it.
...Read the full article from reprobatepress.com
Government notes that porn websites without user comments or uploads will not be within the censorship regime of the upcoming Online Safety Bill
|27th March 2021 |
See article from questions-statements.parliament.uk
Written Question, answered on 24 March 2021
Baroness Grender (Liberal Democrat)
To ask Her Majesty's Government which commercial pornography companies will be in scope of the
Online Safety Bill; and whether commercial pornography websites which
do not host user-generated content, or
allow private user communication, will also be in scope.
Baroness Barran (Conservative)
The government is committed to ensuring children are protected from accessing online pornography through the new online safety framework. Where
pornography sites host user-generated content or facilitate online user interaction such as video and image sharing, commenting and live streaming, they will be subject to the new duty of care. Commercial pornography sites which allow private user to
user communication will be in scope. Where commercial pornography sites do not have user-generated functionality they will not be in scope. The online safety regime will capture both the most visited pornography sites and pornography on social media,
therefore covering the majority of sites where children are most likely to be exposed to pornography.
We expect companies to use age assurance or age verification technologies to prevent children from accessing services which pose
the highest risk of harm to children, such as online pornography. We are working closely with stakeholders across industry to establish the right conditions for the market to deliver age assurance and age verification technical solutions ahead of the
legislative requirements coming into force.
The new internet censor sets out its stall for the censorship of video sharing platforms
|24th March 2021 |
See press release from
Ofcom has published its upcoming censorship rules for video sharing platforms and invites public responses up until 2nd June 2021. For a bit of self-justification for its censorship, Ofcom has commissioned a survey to find that YouTube users and the like are calling out for Ofcom censorship. Ofcom writes:
A third of people who use online video-sharing services have come across hateful content in the last three months, according to a new study by Ofcom.
The news comes as Ofcom
proposes new guidance for sites and apps known as 'video-sharing platforms' (VSPs), setting out
practical steps to protect users from harmful material.
VSPs are a type of online video service where users can upload and share videos with other members of the public. They allow people to engage with a wide range of content.
Under laws introduced by Parliament last year, VSPs established in the UK must take measures to protect under-18s from potentially harmful video content; and all users from videos likely to incite violence or
hatred, as well as certain types of criminal content. Ofcom's job is to enforce these rules and hold VSPs to account.
The draft guidance is designed to help these companies understand what is expected of them under the new
rules, and to explain how they might meet their obligations in relation to protecting users from harm.
Harmful experiences uncovered
To inform our approach, Ofcom has researched how people in the UK
use VSPs, and their claimed exposure to potentially harmful content. Our major findings are:
Hate speech. A third of users (32%) say they have witnessed or experienced hateful content. Hateful content was most often directed towards a racial group (59%), followed by religious groups (28%), transgender people (25%)
and those of a particular sexual orientation (23%).
Bullying, abuse and violence. A quarter (26%) of users claim to have been exposed to bullying, abusive behaviour and threats, and the same proportion came across
violent or disturbing content.
Racist content. One in five users (21%) say they witnessed or experienced racist content, with levels of exposure higher among users from minority ethnic backgrounds (40%), compared to
users from a white background (19%).
Most users encounter potentially harmful videos of some sort. Most VSP users (70%) say they have been exposed to a potentially harmful experience in the last three months,
rising to 79% among 13-17 year-olds.
Low awareness of safety measures. Six in 10 VSP users are unaware of platforms' safety and protection measures, while only a quarter have ever flagged or reported harmful content.
Guidance for protecting users
As Ofcom begins its new role regulating video-sharing platforms, we recognise that the online world is different to other regulated sectors. Reflecting the nature of
video-sharing platforms, the new laws in this area focus on measures providers must consider taking to protect their users -- and they afford companies flexibility in how they do that.
The massive volume of online content means it
is impossible to prevent every instance of harm. Instead, we expect VSPs to take active measures against harmful material on their platforms. Ofcom's new guidance is designed to assist them in making judgements about how best to protect their users. In
line with the legislation, our guidance proposes that all video-sharing platforms should provide:
Clear rules around uploading content. VSPs should have clear, visible terms and conditions which prohibit users from uploading the types of harmful content set out in law. These should be enforced effectively.
Easy flagging and complaints for users. Companies should implement tools that allow users to quickly and effectively report or flag harmful videos, signpost how quickly they will respond, and be open about any action taken.
Providers should offer a route for users to formally raise issues or concerns with the platform, and to challenge decisions through dispute resolution. This is vital to protect the rights and interests of users who upload and share content.
Restricting access to adult sites. VSPs with a high prevalence of pornographic material should put in place effective age-verification systems to restrict under-18s' access to these sites and apps.
Enforcing the rules
Ofcom's approach to enforcing the new rules will build on our track record of protecting audiences from harm, while upholding freedom of expression. We will consider the unique
characteristics of user-generated video content, alongside the rights and interests of users and service providers, and the general public interest.
If we find a VSP provider has breached its obligations to take appropriate
measures to protect users, we have the power to investigate and take action against a platform. This could include fines, requiring the provider to take specific action, or -- in the most serious cases -- suspending or restricting the service. Consistent with our general approach to enforcement, we may, where appropriate, seek to resolve or investigate issues informally first, before taking any formal enforcement action.
We are inviting
all interested parties to comment on our proposed draft guidance, particularly services which may fall within scope of the regulation, the wider industry and third-sector bodies. The deadline for responses is 2 June 2021. Subject to feedback, we plan to
issue our final guidance later this year. We will also report annually on the steps taken by VSPs to comply with their duties to protect users.
Ofcom has been given new powers to regulate
UK-established VSPs. VSP regulation sets out to protect users of VSP services from specific types of harmful material in videos. Harmful material falls into two broad categories under the VSP Framework, which are defined as:
Restricted Material, which refers to videos which have or would be likely to be given an R18 certificate, or which have been or would likely be refused a certificate. It also includes other material that might impair the
physical, mental or moral development of under-18s.
Relevant Harmful Material, which refers to any material likely to incite violence or hatred against a group of persons or a member of a group of persons based on
particular grounds. It also refers to material the inclusion of which would be a criminal offence under laws relating to terrorism; child sexual abuse material; and racism and xenophobia.
The Communications Act sets out the criteria for determining jurisdiction of VSPs, which are closely modelled on the provisions of the Audiovisual Media Services Directive. A VSP will be within UK jurisdiction if it has the required
connection with the UK. It is for service providers to assess whether a service meets the criteria and notify to Ofcom that they fall within scope of the regulation. We recently
published guidance about the criteria to assist them in making this assessment. In December 2020,
the Government confirmed its intention to appoint Ofcom as the regulator of the
future online harms regime . It re-stated its intention for the VSP Framework to be superseded by the regulatory framework in new Online Safety legislation.
Utah Governor signs law requiring internet devices sold locally to be pre-loaded with Net Nanny like porn blocking software
|24th March 2021
The Republican governor of Utah has signed silly legislation requiring all cellphones and tablets sold in the conservative state to be sold with software that automatically blocks pornography.
Governor Spencer Cox claims the measure would send an
important message about preventing children from accessing explicit online content.
In fact the legislation is mere virtue signalling and makes no meaningful proposals for how its requirements can be implemented in practice. So there is a get-out clause that says no immediate steps toward implementation will be made unless five other states enact similar laws, a provision introduced to address concerns that it would be difficult to implement.
The American Civil Liberties Union of Utah said
the constitutionality of the bill was not adequately considered and that it will likely be argued in court.
An internet porn age verification bill progresses in Canada
|19th March 2021 |
See article from sencanada.ca
A bill has passed 2nd reading in the Canadian Senate that would require porn websites to implement age verification for users.
Bill S-203, An Act to restrict young persons' online access to sexually explicit material, will now be referred to the
Standing Senate Committee on Legal and Constitutional Affairs.
A diverse group of organisations criticise Australia's hastily drafted and wide ranging internet censorship bill
|19th March 2021
5th March 2021. See article from innovationaus.com
article from theguardian.com
article from zdnet.com
A number of legal, civil and digital rights, tech companies and adult organisations have raised significant concerns with Australia's proposed internet censorship legislation, and its potential to impact those working in adult industries, to lead to
online censorship, and the vast powers it hands to a handful of individuals.
Despite this, the legislation was introduced to Parliament just 10 days after the government received nearly 400 submissions on the draft bill, and the senate committee is
expected to deliver its report nine days after submissions closed. Stakeholders were also given only three working days to make a submission to the inquiry.
In a submission to the inquiry, Australian Lawyers Alliance (ALA) president Graham Droppert said the government should not proceed with the legislation because it invests excessive discretionary power in the eSafety Commissioner and also the Minister with respect to the consideration of community expectations and values in relation to online content. Droppert said:
The ALA considers that the bill does not strike the appropriate balance between protection against abhorrent material and due process for determining whether content comes within that category.
The powers to be handed to the eSafety Commissioner, which was established in 2015 to focus on keeping children safe online, are a continuation of its broadly expanding remit, and should be cause for concern.
Digital Rights Watch has been leading the charge against the legislation. Digital Rights Watch programme director Lucie Krahulcova said:
The new powers in the bill are discretionary and open-ended, giving all the power and none of the accountability to the eSafety Office. They are not liable for any damage their decisions may cause and not required to report thoroughly on how and why they make removal decisions. This is a dramatic departure from democratic standards globally.
Jarryd Bartle is a lecturer in criminal law and adult industry consultant, and is policy and campaigns advisor at the Eros Association. He said:
The bill as drafted is blatant censorship, with the eSafety commissioner empowered to strip porn, kink and sexually explicit art from the internet following a complaint, with nothing in the scheme capable of distinguishing moral panic from genuine harm.
Twitter and live streaming service Twitch have joined the mounting list of service providers, researchers, and civil liberties groups that take issue with Australia's pending Online Safety Bill.
Of concern to both Twitter and Twitch is the absence of due regard to different types of business models and content types, specifically around the power given to the relevant minister to determine basic online safety expectations for social media
services, relevant electronic services, and designated internet services. Twitter said:
In order to continue to foster digital growth and innovation in the Australian economy, and to ensure reasonable and fair
competition, it is critically important to avoid placing requirements across the digital ecosystem that only large, mature companies can reasonably comply with.
Likewise, Twitch believes it is important to consider a sufficiently
flexible approach that gives due regard to different types of business models and content types.
Update: Fast tracked
19th March 2021. See article from ia.acs.org.au
The Online Safety Bill 2021 will likely get an easy ride into law after a senate environment and communications committee gave it the nod of approval last week.
Under the government's proposed laws, the eSafety Commissioner will be given expanded censorship powers to direct social media platforms and other internet services to take down
material and remove links to content it deems offensive or abusive.
So peer Floella Benjamin attempts to revive porn age verification censorship because porn viewing is just one step away from park murder
|17th March 2021 |
14th March 2021. See article from bills.parliament.uk
The pro-censorship member of the House of Lords has tabled the following amendment to the Domestic Abuse Bill to reintroduce the internet porn censorship and age verification requirements previously dropped by the government in October 2019. It introduces a new clause:
Impact of online pornography on domestic abuse
(1) Within three months of the day on which this Act is passed, the Secretary of State must commission a person appointed by the Secretary of State to investigate the impact of access to online pornography by children on domestic abuse.
(2) Within three months of their appointment, the appointed person must publish a report on the investigation which may include recommendations for the Secretary of State.
(3) As part of the investigation, the appointed person must consider the extent to which the implementation of Part 3 of the Digital Economy Act 2017 (online pornography) would prevent domestic abuse, and may make recommendations to the Secretary of State accordingly.
(4) Within three months of receiving the report, the Secretary of State must publish a response to the recommendations of the appointed person.
(5) If the appointed person recommends that Part 3 of the Digital Economy Act 2017 should be commenced, the Secretary of State must appoint a day for the coming into force of that Part under section 118(6) of the Act within the timeframe recommended by the appointed person.
Member's explanatory statement
This amendment would require an investigation into any link between online pornography and domestic abuse with a view to implementing recommendations to bring into effect
the age verification regime in the Digital Economy Act 2017 as a means of preventing domestic abuse.
17th March 2021. See
article from votes.parliament.uk
The amendment designed to resurrect the Age Verification clauses of the Digital Economy Act 2017 was
defeated by 242 votes to 125 in the House of Lords.
The government minister concluding the debate noted that the new censorship measures included in the Online Harms Bill are more comprehensive than the measures under the Digital Economy Act 2017. He also noted that although the upcoming censorship measures would take significant time to implement, reviving the old censorship measures would also take time.
In passing the minister also explained one of the main failings of the
act was that site blocking would not prove effective due to porn viewers being easily able to evade ISP blocks by switching to encrypted DNS servers via DNS over HTTPS (DoH). Presumably government internet snooping agencies don't fancy losing the ability
to snoop on the browsing habits of all those wanting to continue viewing a blocked porn site such as Pornhub.
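As a rough sketch of what switching to DoH means in practice: instead of asking the ISP's own resolver over plain-text UDP port 53 (which the ISP can inspect and rewrite to enforce a block), the DNS lookup rides inside ordinary HTTPS traffic to a public resolver. The sketch below uses Cloudflare's public dns-query JSON endpoint; the helper function names are our own invention, and resolve() needs network access to actually run.

```python
import json
import urllib.parse
import urllib.request

# Cloudflare's public DNS-over-HTTPS resolver (JSON API variant).
DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"

def build_doh_query(name: str, record_type: str = "A") -> urllib.request.Request:
    """Build an HTTPS request that carries a DNS question for `name`."""
    params = urllib.parse.urlencode({"name": name, "type": record_type})
    return urllib.request.Request(
        f"{DOH_ENDPOINT}?{params}",
        headers={"Accept": "application/dns-json"},  # ask for the JSON wire format
    )

def resolve(name: str) -> list[str]:
    """Resolve `name` over HTTPS and return the answer records' data fields."""
    with urllib.request.urlopen(build_doh_query(name), timeout=10) as resp:
        reply = json.load(resp)
    return [answer["data"] for answer in reply.get("Answer", [])]
```

Because the query and response are wrapped in TLS to a third-party resolver, an ISP relying on DNS-level blocking sees only encrypted traffic to cloudflare-dns.com, which is why the minister's point about DoH undermining site blocking holds.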
If anyone is stupid enough to base a video sharing internet service in the UK, then they will have to sign up for censorship by Ofcom before 6th May 2021. After a year they will have to pay for the privilege too
See statement from ofcom.org.uk
See censorship rules for video sharing platforms from ofcom.org.uk
See guidance for those having to sign up for censorship from ofcom.org.uk
Ofcom has published guidance to help providers self-assess whether they need to notify to Ofcom as UK-established video-sharing platforms.
Video-sharing platforms (VSPs) are a type of online video service which allow users to
upload and share videos with the public.
Under the new VSP regulations , there are specific legal criteria which determine whether a service meets the definition of a VSP, and whether it falls within UK jurisdiction. Platforms
must self-assess whether they meet these criteria, and those that do will be formally required to notify to Ofcom between 6 April and 6 May 2021. Following consultation, we have today published our final guidance to help service providers to make this assessment.
Miserable Thai Government will continue to block Pornhub
|7th March 2021 |
See article from thepattayanews.com
The miserable Thai Government, through the acting Minister of Digital Economy and Society, has confirmed that popular adult website Pornhub will stay blocked in Thailand, giving the reason that the website allegedly encourages poor moral standpoints and can affect youth in a negative manner.
Itthipol Khunplume, the current Minister of Culture (who was previously the Mayor of Pattaya, which allegedly has many adult entertainment-oriented businesses), made the statement to the Associated Thai press.
This decision has also banned hundreds of other prominent adult websites which, according to the Thai Government, are obscene and conflict with good morals for upstanding citizens.
Australian internet child protection bill inevitably slips in censorship capability to block adult consensual porn
|1st March 2021 |
See article from theguardian.com
Fast-tracked internet censorship legislation could ban all adult content online and force sex workers off the internet, sex workers and civil liberties groups have warned.
The Online Safety bill is supposedly aimed at giving powers to Australia's
eSafety commissioner to target bullying and harassment online, extending existing powers protecting children from online bullying to adults.
It increases the maximum penalty for using a carriage service to menace, harass or cause offence from
three to five years in jail, and allows for the removal of image-based abuse and other supposedly harmful online content.
The legislation also promotes the eSafety commissioner to a new post of Internet Censor and gives her the power to
rapidly block sites hosting violent and terrorist content. But the proposals go further. The bill carries over existing powers under the Broadcasting Services Act which allow for content with a rating of R18+ (equivalent to the UK 18 rating) to be
blocked or for removal notices to be issued, and goes much further by giving the Internet Censor sole discretion over whether the content is rated R18+ or over and therefore should be removed.
A Sex Work Law Reform Victoria spokesperson, Roger Sorrenti, said the legislation's effect would be to censor adult online content, with potentially unintended consequences for the sex work and porn industries and a devastating impact on the ability of sex workers to earn a legitimate income. Consultation for the draft legislation attracted more than 370 submissions between 23 December and 14 February, none of which the government published before the communications minister, Paul Fletcher, introduced the legislation
into parliament 10 days later. The bill has been referred to a Senate committee, with submissions due on Tuesday. The government has decided to fast-track this bill despite repeated calls for caution by the industry and civil liberties organisations, as
well as a parallel review currently occurring into Australia's classification scheme.
The eSafety commissioner, Julie Inman Grant, claimed to Guardian Australia that she didn't intend to use her powers under the legislation to go after consensual
adult pornographic material online. But she ominously pointed out that hosting explicit adult sexual content is prohibited in Australia. Guardian Australia has also seen a notice sent by her office in January to adult websites requesting that content be
removed for being R18+, X or refused classification.
North Korea offers a series of supervisory options to punish children for watching porn
|25th February 2021 |
See article from dailynk.com
North Korea is stepping up punishments and intensifying a crackdown based on the anti-reactionary thought law adopted at the end of last year. The law seems to have strengthened the authorities' repression and control over citizens in the country.
According to a source in North Pyongan Province, a teenage boy who was caught watching pornography at his home in Sinuiju earlier this month has been exiled to the countryside along with his parents.
The teenager was watching a pornographic
video late at night when his parents were not at home. He was caught during a surprise inspection by a task force created to monitor deviant behavior.
Article 29 of the new law calls for sentences of five to 15 years of correctional labor for consumption or possession of pornographic videos, or books, photos or drawings that preach superstition. Individuals who produce, import or distribute such materials may get life sentences of correctional labor or even the death penalty, depending on
the quantity of the material.
However, it appears that because the anti-reactionary thought law does not prescribe punishment regulations for adolescents, the punishment was set to deportation instead of correctional labor. Articles 34-38 of the
law stipulate fines of KPW 100,000 to 200,000 if a reactionary thought crime occurs due to the irresponsible education of children and orders the entire family to move to the countryside as punishment for the parents.
Utah House of Representatives passes silly bill to require porn blockers on new mobile devices
|22nd February 2021 |
See article from xbiz.com
The idea of adding censorship software to new phones and tablets sold in Utah has been well debated by moralists in various US state assemblies. But none are quite as silly as Utah when it comes to enacting stupid ideas without a moment's thought for the practicality of the requirement.
Now the Utah House of Representatives has passed an amended version of a controversial bill that would mandate a default porn filter on any phones, computers, tablets or any other electronic devices sold in the state
starting in 2022.
HB 72, sponsored by Representative Susan Pulsipher, a realtor with no technology experience, was speedily passed by the House only hours after it had cleared the committee stage by the narrowest of margins (a 6-5 vote), as XBIZ
reported. The bill was introduced into the Utah Senate yesterday, where it is co-sponsored by staunch anti-porn campaigner Wayne A. Harper.
California state senator introduces a bill targeting Pornhub
|20th February 2021 |
See article from avn.com
A California state senator has introduced a bill that would make online sites liable for non-consensual sexual images and videos uploaded by users.
Titled the Ending Online Sexual Trafficking and Exploitation Act, SB-435, sponsored by San Jose Democrat Dave Cortese, would allow alleged victims of non-consensual image sharing to sue sites and platforms that display the image, potentially collecting damages of $100,000 for every two hours the image remains online after a takedown notification.
The bill requires that sites must take down any such non-consensual image upon receipt of a takedown demand by a person seen in the image.
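Taken at face value, that damages clause compounds quickly. A back-of-envelope sketch (assuming, purely for illustration, that the $100,000 accrues once per completed two-hour period after the takedown notice; the bill's actual accrual rules may differ):

```python
def estimated_damages(hours_online_after_notice: float) -> int:
    """Hypothetical exposure: $100,000 per completed two-hour period online."""
    completed_periods = int(hours_online_after_notice // 2)
    return completed_periods * 100_000

# A clip left up for a full day after notification, under this assumed accrual:
print(f"${estimated_damages(24):,}")
```

Under that reading, a single image left up for a day after a takedown demand would already expose a site to seven-figure damages, which illustrates why platforms regard the bill as a severe liability risk.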
Police from the Indian state of Uttar Pradesh set up a team to snoop on people's porn searches
|15th February 2021 |
See article from theswaddle.com
Police from the Indian state of Uttar Pradesh announced the creation of a team to snoop on people's internet searches for pornographic material. The force has hired a company to surveil searches and keep data of the people who search for porn content.
In India, pornography is banned by the government, but the initial stages of the lockdown last year saw a 95% rise in viewership. The U.P. police's internet search tracking plan is being piloted in six of the state's districts. The monitoring will
now be carried out across the state, which currently has about 11.6 million internet users.
The U.P. police has outsourced its monitoring of porn searches to the company Oomuph. If Oomuph spots an internet user consuming pornography, the police's
analytics team will receive information on the user and search. Porn searches on the internet will also now yield an awareness message that searchers are being tracked by the police.
Floella Benjamin attempts to resuscitate internet porn age verification in a Domestic Abuse Bill
|11th February 2021
See Government statement about age verification (11th
January 2021) from questions-statements.parliament.uk
See the attempt to resuscitate porn age verification in the
Domestic Abuse Bill (10th February 2021) from hansard.parliament.uk
Campaigners for the revival of the deeply flawed and one-sided porn age verification scheme have been continuing their efforts to revive it ever since it was abandoned by the Government in October 2019.
The Government was asked about the possibility
of restoring it in January 2021 in the House of Commons. Caroline Dinenage responded for the government:
The Government announced in October 2019 that it will not commence the age verification provisions of Part 3 of
the Digital Economy Act 2017 and instead deliver these protections through our wider online harms regulatory proposals.
Under our online harms proposals, we expect companies to use age assurance or age verification technologies to
prevent children from accessing services which pose the highest risk of harm to children, such as online pornography. The online harms regime will capture both the most visited pornography sites and pornography on social media, therefore covering the
vast majority of sites where children are most likely to be exposed to pornography. Taken together we expect this to bring into scope more online pornography currently accessible to children than would have been covered by the narrower scope of the
Digital Economy Act.
We would encourage companies to take steps ahead of the legislation to protect children from harmful and age inappropriate content online, including online pornography. We are working closely with stakeholders
across industry to establish the right conditions for the market to deliver age assurance and age verification technical solutions ahead of the legislative requirements coming into force.
In addition, Regulations transposing the
revised Audiovisual Media Services Directive came into force on 1 November 2020 which require UK-established video sharing platforms to take appropriate measures to protect minors from harmful content. The Regulations require that the most harmful
content is subject to the strongest protections, such as age assurance or more technical measures. Ofcom, as the regulatory authority, may take robust enforcement action against video sharing platforms which do not adopt appropriate measures.
Now, during the passage of the Domestic Abuse Bill in the House of Lords, Floella Benjamin attempted to revive the age verification requirement by proposing the following amendment:
Insert the following new Clause --
Impact of online pornography on domestic abuse
(1) Within three months of the day on which this Act is passed, the Secretary of State must commission a person appointed by the
Secretary of State to investigate the impact of access to online pornography by children on domestic abuse.
(2) Within three months of their appointment, the appointed person must publish a report on the investigation which may
include recommendations for the Secretary of State.
(3) As part of the investigation, the appointed person must consider the extent to which the implementation of Part 3 of the Digital Economy Act 2017 (online pornography) would
prevent domestic abuse, and may make recommendations to the Secretary of State accordingly.
(4) Within three months of receiving the report, the Secretary of State must publish a response to the recommendations of the appointed person.
(5) If the appointed person recommends that Part 3 of the Digital Economy Act 2017 should be commenced, the Secretary of State must appoint a day for the coming into force of that Part under section 118(6) of the Act
within the timeframe recommended by the appointed person.
Member's explanatory statement
This amendment would require an investigation into any link between online pornography and
domestic abuse with a view to implementing recommendations to bring into effect the age verification regime in the Digital Economy Act 2017 as a means of preventing domestic abuse.
Floella Benjamin made a long speech supporting the censorship measure and was supported by a number of peers. Of course they all argued only from the 'think of the children' side of the argument; not one of them mentioned trashed adult businesses or the risk to porn viewers of being outed or scammed.
See Floella Benjamin's
speech from hansard.parliament.uk
Australia's adult trade association argues against mainstream adult material being considered as an online 'harm'
See article from businessinsider.com.au
consultation response [pdf] from eros.org.au
Australia's trade association for the adult industry has slammed a proposed law that it claims would harm the livelihood of its workers and limit Australians' sexual expression under the guise of protecting citizens.
The Eros Association has made a
submission to the consultation for the government's Online Safety Act, a law that expands the powers of the eSafety Commissioner to censor what the government considers to be online harms. The Australian internet censorship proposal includes enabling censors to force the takedown of content within statutory timeframes, removal of accounts and even delisting from search engines. The proposed censorship remit includes sexually explicit content from consenting adults. Note that Australia has always had a problem with even vanilla hardcore: the vast majority of the country bans such material from sale in sex shops and from being hosted on Australian websites.
Eros Association policy and campaigns advisor Jarryd Bartle commented:
You could have your business ruined in 24 hours if a complaint is made and a removal notice is issued: your content is taken down, accounts taken down and, in the case of fetish content, your website taken down.
The consultation submission notes:
It is the position of Eros that:
The online content scheme under Part 9 of the Bill should be removed as it is not related to issues of online safety and is likely to harm the livelihood of sex workers, adult media performers and adults-only businesses.
The role of the eSafety Commissioner should be to focus on non-consensual, abusive and harmful content and not imagery of consensual sexual activity between adults.
Under this Bill, the adult media likely to be subject to online content regulation encompasses advertising for sex work services, adult entertainment and adult retailing, impacting a broad range of industries.
As drafted, the
online content scheme would provide for the removal of many forms of adult content impacting the livelihood of producers, sex workers, adult retailers and adult entertainment venues.
The scheme is so broad reaching it would also
limit the sexual expression of Australians online whether or not they are posting sexually explicit content for profit.
It is Eros' view that the proposed scheme is not in keeping with community standards. Previous government
attempts to filter sexually explicit content online have proven very unpopular, and were widely viewed as an infringement on freedom of speech.
The overwhelming majority of Australian pornography users note that adult media has
had a 'positive' or 'neutral' impact on their life. It is therefore inappropriate to regulate this content within a Bill designed to tackle online harms.
Pornhub comes under scrutiny in the Canadian Parliament
|6th February 2021 |
2nd February 2021. See article from xbiz.com
The Committee on Access to Information, Privacy and Ethics in Canada's House of Commons held a hearing concerning allegations made against Pornhub's content moderation policies. The allegations featured in a New York Times article by Nicholas Kristof and
were based on the religious group Exodus Cry's Traffickinghub campaign against the tube site and its parent company MindGeek.
MindGeek is headquartered in Luxembourg, although many of its operations are run from Montreal and the two people identified by the
New York Times as owners are Canadian nationals.
The committee heard from a witness who retold her story of having difficulties getting Pornhub to remove a video she had shot of herself as a teenager, which she then sent to a boyfriend and which was allegedly repeatedly uploaded onto the tube site by unidentified third parties.
The committee also heard from New York lawyer Michael Bowe, who has previously represented disgraced evangelist Jerry Falwell Jr. and Donald Trump. Bowe made
repeated claims about a supposed conspiracy masterminded by MindGeek and its agents and allies to gaslight public opinion about the organized international campaign against Pornhub. Bowe also asked Canada to change its laws to make MindGeek
accountable, and stated that in his opinion the company committed criminal offenses.
Update: Pornhub Releases Statement About Content Moderation Changes
6th February 2021. See
statement from help.pornhub.com
Going forward, we will only allow properly identified users to upload content. We have banned downloads. We have made some key expansions to our moderation process, and we recently launched a Trusted Flagger Program with dozens of non-profit
organizations. Earlier this year, we also partnered with the National Center for Missing & Exploited Children, and next year we will issue our first transparency report. Full details on our expanded policies can be found below.
If you wish to report any content that violates our terms of service, including CSAM or other illegal content, please click this link.
1. Verified Uploaders Only
Effective immediately, only content partners and people within the Model Program will be able to upload content to Pornhub. In the new year, we will implement a verification process so that any user can upload content upon successful completion of identification.
2. Banning Downloads
Effective immediately, we have removed the ability for users to download content from Pornhub, with the exception of paid downloads within the verified Model Program.
In tandem with our fingerprinting technology, this will mitigate the ability of content already removed from the platform to return.
3. Expanded Moderation
We have worked to create
comprehensive measures that help protect our community from illegal content. In recent months we deployed an additional layer of moderation. The newly established "Red Team" will be dedicated solely to self-auditing the platform for potentially
illegal material. The Red Team provides an extra layer of protection on top of the existing protocol, proactively sweeping content already uploaded for potential violations and identifying any breakdowns in the moderation process that could allow a piece of content that violates the Terms of Service to slip through. Additionally, while the list of banned keywords on Pornhub is already extensive, we will continue to identify additional keywords for removal on an ongoing basis. We will also regularly monitor search terms
within the platform for increases in phrasings that attempt to bypass the safeguards in place. Pornhub's current content moderation includes an extensive team of human moderators dedicated to manually reviewing every single upload, a thorough system for
flagging, reviewing and removing illegal material, robust parental controls, and utilization of a variety of automated detection technologies. These technologies include:
CSAI Match, YouTube's proprietary technology for combating Child Sexual Abuse Imagery online
Content Safety API, Google's artificial intelligence tool that helps detect illegal imagery
PhotoDNA, Microsoft's technology that aids in finding and removing known images of child exploitation
Vobile, a fingerprinting software that scans any new uploads for potential matches to unauthorized
materials to protect against banned videos being re-uploaded to the platform.
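The fingerprint-matching approach these tools share can be sketched in outline. The following is a minimal illustrative example only, not Pornhub's or any vendor's actual implementation: real systems such as PhotoDNA and Vobile use far more robust perceptual and video fingerprints. Here a simple 64-bit average hash of a frame stands in for the fingerprint, and near-duplicates are detected by Hamming distance:

```python
# Illustrative sketch of fingerprint-based re-upload blocking.
# NOT a real moderation system: the hash, threshold and frame size
# are toy assumptions chosen to show the matching idea.

def average_hash(pixels):
    """64-bit average hash of an 8x8 grayscale frame (64 ints, 0-255)."""
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        # Each pixel contributes one bit: 1 if at or above the mean.
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

class ReuploadBlocker:
    def __init__(self, threshold=8):
        self.banned = set()          # fingerprints of removed content
        self.threshold = threshold   # max bit distance counted as a match

    def ban(self, pixels):
        self.banned.add(average_hash(pixels))

    def is_banned(self, pixels):
        h = average_hash(pixels)
        # Near-match, not exact-match: catches re-encoded copies.
        return any(hamming(h, b) <= self.threshold for b in self.banned)
```

A slightly altered copy (re-encoded or brightened) still lands within the distance threshold, which is why fingerprint matching can catch re-uploads that an exact byte-for-byte check would miss.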
If a user encounters a piece of content they think may violate the Terms of Service, we encourage them to immediately flag the video or fill out the Content Removal Request Form, which is linked on every page.
Our policy is to immediately disable any content reported in the Content Removal Request Form for review.
4. Trusted Flagger Program
We recently launched a Trusted Flagger
Program, a new initiative empowering non-profit partners to alert us of content they think may violate our Terms of Service. The Trusted Flagger Program consists of more than 40 leading non-profit organizations in the space of internet and child safety.
Our partners have a direct line of access to our moderation team, and any content identified by a Trusted Flagger is immediately disabled. Partners include: Cyber Civil Rights Initiative (United States of America), National Center for Missing &
Exploited Children (United States of America), Internet Watch Foundation (United Kingdom), Stopline (Austria), Child Focus (Belgium), Safenet (Bulgaria), Te Protejo Hotline - I Protect You Hotline (Colombia), CZ.NIC - Stop Online (Czech Republic), Point
de Contact (France), Eco-Association of the Internet Industry (Germany), Safeline (Greece), Save the Children (Iceland), Latvian Internet Association (Latvia), Meldpunt Kinderporno - Child Pornography Reporting Point (Netherlands), Centre for Safer
Internet Slovenia (Slovenia), FPB Hotline - Film and Publication Board (South Africa), ECPAT (Sweden), ECPAT (Taiwan).
5. NCMEC Partnership
Last year, we voluntarily partnered with the National
Center for Missing & Exploited Children (NCMEC) in order to transparently report and limit incidents of CSAM on our platform. In early 2021, NCMEC will release our total number of reported CSAM incidents alongside numbers from other major social and
content platforms. We will also continue to work with law enforcement globally to report and curb any issues of illegal content.
6. Transparency Report
In 2021, we will release a Transparency Report
detailing our content moderation results from 2020. This will identify not just the full number of reports filed with NCMEC, but also other key details related to the trust and safety of our platform. Much like Facebook, Instagram, Twitter and other tech
platforms, Pornhub seeks to be fully transparent about the content that should and should not appear on the platform. This will make us the only adult content platform to release such a report.
7. Independent Review
As part of our commitment, in April 2020 we hired the law firm of Kaplan Hecker & Fink LLP to conduct an independent review of our content compliance function, with a focus on meeting legal standards and eliminating all
non-consensual content, CSAM and any other content uploaded without the meaningful consent of all parties. We requested that the goal of the independent review be to identify the requisite steps to achieve a "best-in-class" content compliance
program that sets the standard for the technology industry. Kaplan Hecker & Fink LLP is continuing its review, but has already identified and categorized a comprehensive inventory of remedial recommendations, supported by dozens of additional
sub-recommendations, in addition to the steps identified above, based on an evaluation and assessment of our current policies and practices. Kaplan Hecker & Fink LLP is soliciting information to assist with its review and in developing
recommendations regarding our compliance policies and procedures. If you would like to provide compliance suggestions, you can do so here .