Melon Farmers Unrated

Internet Porn Censorship



 

Bodice ripping...

Netflix costume drama Bridgerton is a hit on porn tubes


Link Here 21st January 2021
Netflix's costume drama Bridgerton is at the center of a bit of a stir. Shonda Rhimes' first production for the company became an instant phenomenon, particularly for its sex scenes, which have been shared widely across the internet.

Media company Page Six has reported that sex scenes ripped from the show have racked up hundreds of thousands of views on adult video streaming platforms.

Netflix has responded by issuing warnings about the illegal use of its intellectual property, which has led to some clips being deleted, but in most cases the footage remains online.

An anonymous source told Britain's The Sun that actress Phoebe Dynevor is upset by the circulation of the sexy footage. The Sun added that the discomfort has spread through a good part of the cast, which could affect the filming of the already confirmed second season.

 

 

Offsite Article: The SISEA internet sex bill...


Link Here 19th January 2021
Full story: Pornhub...An ongoing target of censors
What do sex workers need to know?

See article from kulturehub.com

 

 

Offsite Article: Fifty Shades of Blame...


Link Here 11th January 2021
Rough sex porn and BDSM discussed in article about Grace Millane murder trial

See article from 9news.com.au

 

 

Offsite Article: The New War on Porn...


Link Here 3rd January 2021
Full story: Pornhub...An ongoing target of censors
How Moral Crusaders, Mainstream Media and Politicians Are Gunning for XXX

See article from xbiz.com

 

 

Offsite Article: Study: pornography does not cause sexual violence, despite what many believe...


Link Here 31st December 2020
A large-scale meta-analysis aims to disprove the notion that pornography consumption causes sexual aggression and violence. By Jaimee Bell

See article from bigthink.com

 

 

Offsite Article: How Pornhub's video purge is hurting sex workers...


Link Here 27th December 2020
Full story: Pornhub...An ongoing target of censors
Following the seismic exposure of Pornhub for hosting non-consensual and abusive content, credit card companies have cut ties with the adult site, but creators who rely on the platform have been left in the lurch

See article from dazeddigital.com

 

 

Anti-hubbers...

Senators propose far reaching anti-pornHub censorship law


Link Here 21st December 2020
Full story: Pornhub...An ongoing target of censors
U.S. Senators Ben Sasse and Jeff Merkley have introduced a bipartisan bill calling for extensive new censorship rules for adult websites.

Dubbed the Stop Internet Sexual Exploitation Act, the bill was prompted, according to an announcement from Sasse's office, by reports of how videos and photos are uploaded to websites like Pornhub without the consent of individuals who appear in them. In particular, the bill seems triggered by charges filed against Pornhub parent company MindGeek alleging that the company knowingly hosted and profited from non-consensual videos produced by the website GirlsDoPorn.

Among the censorship measures the new bill seeks to enact are:
  • Requiring any user uploading a video to the platform to verify their identity
  • Requiring any user uploading a video to also upload a signed consent form from every individual appearing in the video
  • Creating a private right of action against an uploader who uploads a pornographic image without the consent of an individual featured in the image
  • Requiring platforms hosting pornography to include a notice or banner on the website instructing individuals how to request removal of a video they have not consented to being uploaded to the platform
  • Prohibiting video downloads from these platforms, to be in place within three months of enactment of the legislation
  • Requiring platforms hosting pornography to offer a 24-hour hotline, staffed by the platform, for individuals to request removal of a video that has been distributed without their consent
  • Requiring removal of flagged videos within two hours of such a request
  • Requiring platforms to use software to block a video from being re-uploaded after its removal, to be in place within six months of enactment of the legislation
  • Directing the Federal Trade Commission to enforce violations of these requirements
  • Creating a database of individuals who have indicated they do not consent, which must be checked before new content can be uploaded to platforms
  • Instructing the Department of Justice to promulgate rules on where this database should be housed, and to determine how to connect victims with services, including counseling and casework
  • Establishing that failure to comply with these requirements will result in a civil penalty for the platform, with proceeds going towards victim services
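The bill's re-upload blocking requirement is, in practice, usually met with content fingerprinting: the platform stores a fingerprint of every removed video and checks new uploads against that block list. A minimal sketch of the idea follows; all names here are hypothetical, and a plain cryptographic hash is used for simplicity, whereas real systems use perceptual fingerprints that survive re-encoding and cropping:

```python
import hashlib


class UploadGate:
    """Sketch of a re-upload block list: fingerprint removed videos
    and reject new uploads whose fingerprint matches.

    Note: SHA-256 only catches byte-identical copies; production
    systems use perceptual fingerprinting to catch re-encoded copies.
    """

    def __init__(self):
        self.blocked = set()  # fingerprints of removed content

    @staticmethod
    def fingerprint(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def remove(self, data: bytes) -> None:
        # Called when a video is taken down after a removal request
        self.blocked.add(self.fingerprint(data))

    def allow_upload(self, data: bytes) -> bool:
        # Called before a new upload is published
        return self.fingerprint(data) not in self.blocked


gate = UploadGate()
clip = b"...raw video bytes..."
gate.remove(clip)                # takedown request processed
gate.allow_upload(clip)          # identical re-upload is now rejected
gate.allow_upload(b"other clip") # unrelated content still allowed
```

The design choice the bill forces is essentially this check running at upload time, which is why the legislation gives platforms six months to deploy it.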

 

 

Softcore drama...

The Pamela Anderson and Tommy Lee sex tape is to be a new internet TV drama


Link Here 18th December 2020
Pamela Anderson's infamous sex tape with Tommy Lee is now at the heart of a proposed limited series coming to the Hulu internet TV service.

Lily James ( Pride and Prejudice and Zombies ) and Sebastian Stan ( Captain America: The Winter Soldier ) are set to play the former Baywatch star and Mötley Crüe drummer in the new series dubbed Pam & Tommy .

It is reported that Pamela Anderson and Tommy Lee are aware of the project but are not involved in the production.

 

 

Harming the internet...

The Government outlines its final plans to introduce new and wide ranging internet censorship laws


Link Here 15th December 2020
Full story: Online Harms White Paper...UK Government seeks to censor social media

Digital Secretary Oliver Dowden and Home Secretary Priti Patel have announced the government's final decisions on new internet censorship laws.

  • New rules to be introduced for nearly all tech firms that allow users to post their own content or interact

  • Firms failing to protect people face fines of up to ten per cent of turnover or the blocking of their sites and the government will reserve the power for senior managers to be held liable

  • Popular platforms to be held responsible for tackling both legal and illegal harms

  • All platforms will have a duty of care to protect children using their services

  • Laws will not affect articles and comments sections on news websites, and there will be additional measures to protect free speech

The full government response to the Online Harms White Paper consultation sets out how the proposed legal duty of care on online companies will work in practice and gives them new responsibilities towards their users. The safety of children is at the heart of the measures.

Social media sites, websites, apps and other services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal content such as child sexual abuse, terrorist material and suicide content. The Government is also progressing work with the Law Commission on whether the promotion of self harm should be made illegal.

Tech platforms will need to do far more to protect children from being exposed to harmful content or activity such as grooming, bullying and pornography. This will help make sure future generations enjoy the full benefits of the internet with better protections in place to reduce the risk of harm.

The most popular social media sites, with the largest audiences and high-risk features, will need to go further by setting and enforcing clear terms and conditions which explicitly state how they will handle content which is legal but could cause significant physical or psychological harm to adults. This includes dangerous disinformation and misinformation about coronavirus vaccines, and will help bridge the gap between what companies say they do and what happens in practice.

Ofcom is now confirmed as the regulator with the power to fine companies failing in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher. It will have the power to block non-compliant services from being accessed in the UK.

The legislation includes provisions to impose criminal sanctions on senior managers. The government will not hesitate to bring these powers into force should companies fail to take the new rules seriously - for example, if they do not respond fully, accurately and in a timely manner to information requests from Ofcom. This power would be introduced by Parliament via secondary legislation, and reserving the power to compel compliance follows similar approaches in other sectors such as financial services regulation.

The government plans to bring the laws forward in an Online Safety Bill next year and set the global standard for proportionate yet effective regulation. This will safeguard people's rights online and empower adult users to keep themselves safe while preventing companies arbitrarily removing content. It will defend freedom of expression and the invaluable role of a free press, while driving a new wave of digital growth by building trust in technology businesses.

Scope

The new regulations will apply to any company in the world hosting user-generated content online accessible by people in the UK or enabling them to privately or publicly interact with others online.

It includes social media, video sharing and instant messaging platforms, online forums, dating apps, commercial pornography websites, as well as online marketplaces, peer-to-peer services, consumer cloud storage sites and video games which allow online interaction. Search engines will also be subject to the new regulations.

The legislation will include safeguards for freedom of expression and pluralism online - protecting people's rights to participate in society and engage in robust debate.

Online journalism from news publishers' websites will be exempt, as will reader comments on such sites. Specific measures will be included in the legislation to make sure journalistic content is still protected when it is reshared on social media platforms.

Categorised approach

Companies will have different responsibilities for different categories of content and activity, under an approach focused on the sites, apps and platforms where the risk of harm is greatest.

All companies will need to take appropriate steps to address illegal content and activity such as terrorism and child sexual abuse. They will also be required to assess the likelihood of children accessing their services and, if so, provide additional protections for them. This could be, for example, by using tools that give age assurance to ensure children are not accessing platforms which are not suitable for them.

The government will make clear in the legislation the harmful content and activity that the regulations will cover and Ofcom will set out how companies can fulfil their duty of care in codes of practice.

A small group of companies with the largest online presences and high-risk features, likely to include Facebook, TikTok, Instagram and Twitter, will be in Category 1.

These companies will need to assess the risk of legal content or activity on their services with "a reasonably foreseeable risk of causing significant physical or psychological harm to adults". They will then need to make clear what type of "legal but harmful" content is acceptable on their platforms in their terms and conditions and enforce this transparently and consistently.

All companies will need mechanisms so people can easily report harmful content or activity while also being able to appeal the takedown of content. Category 1 companies will be required to publish transparency reports about the steps they are taking to tackle online harms.

Examples of Category 2 services are platforms which host dating services or pornography and private messaging apps. Less than three per cent of UK businesses will fall within the scope of the legislation and the vast majority of companies will be Category 2 services.

Exemptions

Financial harms will be excluded from this framework, including fraud and the sale of unsafe goods. This will mean the regulations are clear and manageable for businesses, focus action where there will be most impact, and avoid duplicating existing regulation.

Where appropriate, lower-risk services will be exempt from the duty of care to avoid putting disproportionate demands on businesses. This includes exemptions for retailers who only offer product and service reviews and software used internally by businesses. Email services will also be exempt.

Some types of advertising, including organic and influencer adverts that appear on social media platforms, will be in scope. Adverts placed on an in-scope service through a direct contract between an advertiser and an advertising service, such as Facebook or Google Ads, will be exempt because this is covered by existing regulation.

Private communications

The response will set out how the regulations will apply to communication channels and services where users expect a greater degree of privacy - for example online instant messaging services and closed social media groups which are still in scope.

Companies will need to consider the impact on user privacy and that they understand how company systems and processes affect people's privacy, but firms could, for example, be required to make services safer by design by limiting the ability for anonymous adults to contact children.

Given the severity of the threat on these services, the legislation will enable Ofcom to require companies to use technology to monitor, identify and remove tightly defined categories of illegal material relating to child sexual exploitation and abuse. Recognising the potential impact on user privacy, the government will ensure this is only used as a last resort where alternative measures are not working. It will be subject to stringent legal safeguards to protect user rights.

 

 

Censorship hubbub...

Pornhub takes down the majority of its videos, those from unverified uploaders


Link Here 15th December 2020
Full story: Pornhub...An ongoing target of censors
Pornhub explained in a blog post:

At Pornhub, the safety of our community is our top priority. Last week, we enacted the most comprehensive safeguards in user-generated platform history. We banned unverified uploaders from posting new content, eliminated downloads, and partnered with dozens of non-profit organizations, among other major policy changes (please read here for more details).

As part of our policy to ban unverified uploaders, we have now also suspended all previously uploaded content that was not created by content partners or members of the Model Program. This means every piece of Pornhub content is from verified uploaders, a requirement that platforms like Facebook, Instagram, TikTok, YouTube, Snapchat and Twitter have yet to institute.

Leading non-profit organizations and advocacy groups acknowledge our efforts to date at combating illegal content have been effective. Over the last three years, Facebook self-reported 84 million instances of child sexual abuse material. During that same period, the independent, third-party Internet Watch Foundation reported 118 incidents on Pornhub. That is still 118 too many, which is why we are committed to taking every necessary action.

It is clear that Pornhub is being targeted not because of our policies and how we compare to our peers, but because we are an adult content platform. The two groups that have spearheaded the campaign against our company are the National Center on Sexual Exploitation (formerly known as Morality in Media) and Exodus Cry/TraffickingHub. These are organizations dedicated to abolishing pornography, banning material they claim is obscene, and shutting down commercial sex work. These are the same forces that have spent 50 years demonizing Playboy, the National Endowment for the Arts, sex education, LGBTQ rights, women's rights, and even the American Library Association. Today, it happens to be Pornhub.

In today's world, all social media platforms share the responsibility to combat illegal material. Solutions must be driven by real facts and real experts. We hope we have demonstrated our dedication to leading by example.

 

 

Updated: Best not to trust internet payment companies...

They can control where you are allowed to spend your money, and in this case at Pornhub


Link Here 13th December 2020
Full story: Pornhub...An ongoing target of censors
Payments giant Mastercard is considering banning people from spending their money at Pornhub.

Mastercard is reviewing its business with pornography platform Pornhub, following a campaign against the website highlighted by the New York Times.

Mastercard responded after reporter Nicholas Kristof said he didn't see why search engines, banks or credit-card companies should bolster Pornhub.

Pornhub is free to use but users can pay £9.99 a month for higher-quality video streams and advert-free and exclusive content.

Update: Visa too

13th December 2020. See article from avn.com

Visa followed Mastercard's lead and said that it too won't allow Pornhub users to use their credit cards to make charges on the adult content site. Visa said in a statement:
We are instructing the financial institutions who serve MindGeek to suspend processing of payments through the Visa network.

And according to Bloomberg.com, Mastercard said it's continuing to investigate potential illegal content on other websites, most likely XVideos.

 

 

Harming the internet...

Ofcom consults about its plans to tool up for its new roles as the UK internet censor


Link Here 11th December 2020
Full story: Online Harms White Paper...UK Government seeks to censor social media
Ofcom has opened a consultation on its plan to get ready for its likely role as the UK internet censor under the Government's Online Harms legislation. Ofcom writes:

We have today published our plan of work for 2021/22. This consultation sets out our goals for the next financial year, and how we plan to achieve them.

We are consulting on this plan of work to encourage discussion with companies, governments and the public.

As part of the Plan of Work publication, we are also holding some virtual events to invite feedback on our proposed plan. These free events are open to everyone, and offer an opportunity to comment and ask questions.

The consultation ends on 5th February 2021.

The Key areas referencing internet censorship are:

Preparing to regulate online harms

3.26 The UK Government has given Ofcom new duties as the regulator for UK-established video-sharing platforms (VSPs) through the transposition of the European-wide Audiovisual Media Services Directive. VSPs are a type of online video service where users can upload and share videos with members of the public, such as YouTube and TikTok. Ofcom will not be responsible for regulating all VSPs as our duties only apply to services established in the UK and, as such, we anticipate that a relatively small number of services fall within our jurisdiction. Under the new regulations, which came into force on 1 November 2020, VSPs must have appropriate measures in place to protect children from potentially harmful content and all users from criminal content and incitement to hatred and violence. VSPs will also need to make sure certain advertising standards are met.

3.27 As well as appointing Ofcom as the regulator of UK-established VSPs, the Government has announced that it is minded to appoint Ofcom as the future regulator responsible for protecting users from harmful online content. With this in mind we are undertaking the following work:

  • Video-sharing platforms regulation. We have issued a short guide to the new requirements. On 19 November 2020 we issued draft scope and jurisdiction guidance for consultation to help providers self-assess whether they need to notify Ofcom as a VSP under the statutory rules from April 2021. We will also consult in early 2021 on further guidance on the risk of harms and appropriate measures, as well as proposals for a co-regulatory relationship with the Advertising Standards Authority (ASA) with regards to VSP advertising. We intend to issue final versions of the guidance in summer 2021.

  • Preparing for the online harms regime. The UK Government has set out that it intends to put in place a regime to keep people safe online. In February 2020 it published an initial response to the 2019 White Paper setting out how it intends to develop the regime, which stated that it was minded to appoint Ofcom as the future regulator of online harms. If confirmed, these proposed new responsibilities would constitute a significant expansion to our remit, and preparing for them would be a major area of focus in 2021/22. We will continue to provide technical advice to the UK Government on its policy development process, and we will engage with Parliament as it considers legislative proposals.

3.29 We will continue work to deepen our understanding of online harms through a range of work:

  • Our Making Sense of Media programme. This programme will continue to provide insights on the needs, behaviours and attitudes of people online. Our other initiatives to research online markets and technologies will further our understanding of how online harms can be mitigated.

  • Stepping up our collaboration with other regulators. As discussed in the Developing strong partnerships section, we will continue our joint work through the Digital Regulators Cooperation Forum and strengthen our collaboration with regulators around the world who are also considering online harms.

  • Understanding VSPs. The introduction of regulation to UK-established VSPs will provide a solid foundation to inform and develop the broader future online harms regulatory framework. This interim regime is more limited in terms of the number of regulated companies and will cover a narrower range of harms compared to the online harms white paper proposals. However, should Ofcom be confirmed as the regulator, through our work on VSPs we will develop on-the-job experience of working with newly regulated online services, developing the evidence base of online harm, and building our internal skills and expertise.

 

 

Upping its game...

Pornhub responds to criticism and now requires full identification of uploaders


Link Here 9th December 2020
Full story: Pornhub...An ongoing target of censors
Pornhub has responded to criticism by a significant change of rules to require formal identification of uploaders. Pornhub explains:

Today, we are taking major steps to further protect our community. Going forward, we will only allow properly identified users to upload content. We have banned downloads. We have made some key expansions to our moderation process, and we recently launched a Trusted Flagger Program with dozens of non-profit organizations. Earlier this year, we also partnered with the National Center for Missing & Exploited Children, and next year we will issue our first transparency report. Full details on our expanded policies can be found below.

1. Verified Uploaders Only

Effective immediately, only content partners and people within the Model Program will be able to upload content to Pornhub. In the new year, we will implement a verification process so that any user can upload content upon successful completion of identification protocol.

2. Banning Downloads

Effective immediately, we have removed the ability for users to download content from Pornhub, with the exception of paid downloads within the verified Model Program. In tandem with our fingerprinting technology, this will mitigate the ability for content already removed from the platform to be able to return.

3. Expanded Moderation

We have worked to create comprehensive measures that help protect our community from illegal content. In recent months we deployed an additional layer of moderation. The newly established "Red Team" will be dedicated solely to self-auditing the platform for potentially illegal material. The Red Team provides an extra layer of protection on top of the existing protocol, proactively sweeping content already uploaded for potential violations and identifying any breakdowns in the moderation process that could allow a piece of content that violates the Terms of Service. Additionally, while the list of banned keywords on Pornhub is already extensive, we will continue to identify additional keywords for removal on an ongoing basis. We will also regularly monitor search terms within the platform for increases in phrasings that attempt to bypass the safeguards in place. Pornhub's current content moderation includes an extensive team of human moderators dedicated to manually reviewing every single upload, a thorough system for flagging, reviewing and removing illegal material, robust parental controls, and utilization of a variety of automated detection technologies. These technologies include:

  • CSAI Match, YouTube's proprietary technology for combating Child Sexual Abuse Imagery online

  • Content Safety API, Google's artificial intelligence tool that helps detect illegal imagery

  • PhotoDNA, Microsoft's technology that aids in finding and removing known images of child exploitation

  • Vobile, a fingerprinting software that scans any new uploads for potential matches to unauthorized materials to protect against banned videos being re-uploaded to the platform.

4. Trusted Flagger Program

We recently launched a Trusted Flagger Program, a new initiative empowering non-profit partners to alert us of content they think may violate our Terms of Service. The Trusted Flagger Program consists of more than 40 leading non-profit organizations in the space of internet and child safety. Our partners have a direct line of access to our moderation team, and any content identified by a Trusted Flagger is immediately disabled. Partners include: Cyber Civil Rights Initiative (United States of America), National Center for Missing & Exploited Children (United States of America), Internet Watch Foundation (United Kingdom), Stopline (Austria), Child Focus (Belgium), Safenet (Bulgaria), Te Protejo Hotline - I Protect You Hotline (Colombia), CZ.NIC - Stop Online (Czech Republic ), Point de Contact (France), Eco-Association of the Internet Industry (Germany), Safeline (Greece), Save the Children (Iceland), Latvian Internet Association (Latvia), Meldpunt Kinderporno - Child Pornography Reporting Point (Netherlands), Centre for Safer Internet Slovenia (Slovenia), FPB Hotline - Film and Publication Board (South Africa), ECPAT (Sweden), ECPAT (Taiwan).

5. NCMEC Partnership

Last year, we voluntarily partnered with the National Center for Missing & Exploited Children (NCMEC) in order to transparently report and limit incidents of CSAM on our platform. In early 2021, NCMEC will release our total number of reported CSAM incidents alongside numbers from other major social and content platforms. We will also continue to work with law enforcement globally to report and curb any issues of illegal content.

6. Transparency Report

In 2021, we will release a Transparency Report detailing our content moderation results from 2020. This will identify not just the full number of reports filed with NCMEC, but also other key details related to the trust and safety of our platform. Much like Facebook, Instagram, Twitter and other tech platforms, Pornhub seeks to be fully transparent about the content that should and should not appear on the platform. This will make us the only adult content platform to release such a report.

7. Independent Review

As part of our commitment, in April 2020 we hired the law firm of Kaplan Hecker & Fink LLP to conduct an independent review of our content compliance function, with a focus on meeting legal standards and eliminating all non-consensual content, CSAM and any other content uploaded without the meaningful consent of all parties. We requested that the goal of the independent review be to identify the requisite steps to achieve a "best-in-class" content compliance program that sets the standard for the technology industry. Kaplan Hecker & Fink LLP is continuing its review, but has already identified and categorized a comprehensive inventory of remedial recommendations, supported by dozens of additional sub-recommendations, in addition to the steps identified above, based on an evaluation and assessment of our current policies and practices. Kaplan Hecker & Fink LLP is soliciting information to assist with its review and in developing recommendations regarding our compliance policies and procedures.

 

 

Offsite Article: Miserable Times...


Link Here 6th December 2020
Full story: Pornhub...An ongoing target of censors
The New York Times calls for the censorship of Pornhub

See article from xbiz.com

 

 

Shared video censorship...

House of Lords approves adoption of the EU's internet video sharing censorship laws into post Brexit UK law


Link Here 29th November 2020
Full story: Online Harms White Paper...UK Government seeks to censor social media
The House of Lords approved a statutory instrument that adopts the EU's Audio Visual Media Services Directive into post-Brexit UK law. This law describes state censorship requirements for internet video sharing platforms.

The law change was debated on 27th November 2020 with the government introducing the law as follows:

Baroness Barran, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

My Lords, I am pleased to introduce this instrument, laid in both Houses on 15 October, which is being made under the European Union (Withdrawal) Act 2018. These regulations remedy certain failures of retained EU law arising from the withdrawal of the United Kingdom from the EU. This instrument seeks to maintain, but not expand, Ofcom's remit to regulate video-sharing platform services. This intervention is necessary to ensure the law remains operable beyond the end of the transition period.

The EU's audiovisual media services directive, known as the AVMS directive, governs the co-ordination of national legislation on audio-visual media services. The AVMS directive was initially implemented into UK law in 2010, primarily by way of amendments to UK broadcasting legislation. The directive was subsequently revised in 2018. The UK Audiovisual Media Services Regulations 2020, which transposed the revised AVMS directive, were made and laid in Parliament on 30 September. Those regulations came into force on 1 November and introduced, for the first time, rules for video-sharing platform services. The Government have appointed Ofcom as the regulator for these services. The new rules ensure that platforms falling within UK jurisdiction have appropriate systems and processes to protect the public, including minors, from illegal and harmful material.

There were three key requirements placed on video-sharing platforms under the regulations. These were: to take appropriate measures to protect minors under 18 from harmful content, to take appropriate measures to protect the general public from harmful and certain illegal content, and to introduce standards around advertising. I also draw the attention of the House to the report from the Secondary Legislation Scrutiny Committee considering this instrument, and I thank its members for their work.

I will now address the committee's concerns regarding jurisdiction. The AVMS directive sets out technical rules governing when a platform falls within a country's jurisdiction. First, there must be a physical presence, or a group undertaking, of the platform in the country. Where there is a physical presence in more than one country, jurisdiction is decided on the basis of factors such as whether the platform is established in that country, whether the platform's main economic activity is centred in that country, and the hierarchy of group undertakings as set out by the directive.

Under the revised AVMS directive, each EU member state and the UK is responsible for regulating only the video-sharing platforms that fall within its jurisdiction. There will be only one country that has jurisdiction for each platform at any one time. However, if a platform has no physical presence in any country covered by the AVMS directive, then no country will have jurisdiction over it, even if the platform provides services in those countries.

Through this instrument, we are seeking to maintain the same position for Ofcom's remit beyond the end of the transition period. This position allows Ofcom to regulate video-sharing platforms established in the UK and additionally regulate platforms that have a physical presence in the UK but not in any other country covered by the AVMS directive. Although Ofcom's remit will not be extended to include platforms established elsewhere in the EU, we believe UK users will indirectly benefit from the EU's regulation of platforms under the AVMS directive. The regulation under this regime is systems regulation, not content regulation. We therefore expect that, as platforms based outside the UK set up and invest in systems to comply with the AVMS directive, those same systems will probably also be applied to their UK services.

In the absence of this instrument, Ofcom would no longer be able to regulate any video-sharing platforms. This would result in an unacceptable regulatory gap and a lack of protection for UK users using these services. Our approach also mitigates the small risk that a video-sharing platform offering services to countries covered by the AVMS directive, but not the UK, would establish itself in the UK in order to circumvent EU law.

While we recognise that most children have a positive experience online, the reality is that the impact of harmful content and activity online can be particularly damaging for children. Over three-quarters of UK adults also express concerns about going online. The UK is one of only three countries to have transposed the revised directive thus far, evidencing our commitment to protecting users online.

These regulations also pave the way for the upcoming online harms regulatory regime. Given that the online harms regulatory framework shares broadly the same objectives as the video-sharing platform regime, it is the Government's intention that the regulation of video-sharing platforms in the UK will be superseded by the online harms legislation, once the latter comes into force. Further details on the plans for online harms regulation will be set out in the full government response to the consultation on the Online Harms White Paper, which is due to be published later this year, with draft legislation ready in early 2021. With that, I beg to move.

 

 

Warning: Porn can be harmful if viewed by Utah moralists...

PornHub and other tube websites add trigger warning about porn as required by Utah law


Link Here 26th November 2020
Full story: US politicans and porn harms...US states claim porn to be a public health hazard
Some porn websites are beginning to comply with a nonsense new Utah law requiring warning labels be attached to adult-oriented materials.

At least three major tube sites, Pornhub, XTube and RedTube, have begun attaching the opt-in notification for visitors, which states that Utah believes pornographic materials can be harmful if viewed by minors.

This trigger warning is a response to a Utah state law sponsored by Utah House Representative Brady Brammer earlier this year. The bill started life as an attempt to restrict porn in the state but was watered down until it ended up as a trivial warning requirement.

 

 

'I'm not on a crusade against porn. I just want to protect kids'...

Anti-porn crusader introduces Canadian private members bill to require strict age verification for porn sites


Link Here 23rd November 2020
Full story: Internet Censorship in Canada...Proposal for opt-in internet blocking
Independent Quebec Senator Julie Miville-Dechêne is calling for censorship of online porn through new legislation that would force porn sites to verify the ages of all users.

Miville-Dechêne has introduced a bill, S-203, that would make porn sites like the Canadian-owned PornHub criminally liable for failing to check a user's age before they browse.

Miville-Dechêne, who was appointed by Prime Minister Justin Trudeau in 2018, spouted anti-porn rhetoric, saying that children and teenagers must be protected against graphic material that she said can pollute their minds. She continued:

I'm not on a crusade against porn. I just want to protect kids from porn that is shown widely on these websites that is not at all the soft kind of stuff. It's hardcore, it's tough and it's violent.

Her bill would make it a Criminal Code offence to make sexually explicit material available to a minor on the internet. A first offence would be punishable by a fine of not more than $10,000 for an individual and $250,000 for a corporation. Fines for subsequent offences would be more substantial.
