In a survey more about net neutrality than porn censorship, MoneySupermarket noted:
We conducted a survey of over 2,000 Brits on this and it seems that if an ISP decided to block sites, it could result in increasing numbers of Brits switching - 64 per cent of Brits would be likely to switch ISP if they put blocks in place
In reality, this means millions could be considering a switch, with nearly six million having tried to access a site that was blocked in the last week - nearly one in 10 across the country.
It's an issue even more pertinent for those aged 18 to 34, with nearly half (45 per cent) having tried to access a site that was blocked at some point.
While ISPs might block sites for various reasons, a quarter of Brits said they would switch ISP if they were blocked from viewing adult sites - with those living with partners the most likely to do so!
Now switching ISPs isn't going to help much if the BBFC, the government-appointed porn censor, has dictated that all ISPs block porn sites. But maybe this 25% of internet users will take up alternatives such as subscribing to a VPN service.
The BBFC has made a few changes to its approach since the rather ropey document published prior to the BBFC's public consultation. In general the BBFC seems a little more pragmatic about trying to get adult porn users to buy into the age
verification way of thinking. The BBFC seems supportive of the anonymously bought porn access card from the local store, and has taken a strong stance against age verification providers who reprehensibly want to record people's porn browsing,
claiming a need to provide an audit trail.
The BBFC has also decided to offer a service to certify age verification providers in the way that they protect people's data. This is again probably targeted at making adult porn users a bit more confident in handing over ID.
The BBFC's tone is a little more acknowledging of people's privacy concerns, but it is the government's law, implemented by the BBFC, that allows the recipients of the data to use it more or less how they like. Once you tick the 'take it or
leave it' consent box allowing the AV provider 'to make your user experience better', then they can do what they like with your data (although GDPR does kindly let you later withdraw that consent and see what they have got on you).
Another theme that runs through the document is a rather ironic acceptance that, for all the devastation that will befall the UK porn industry, for all the lives ruined by people having their porn viewing outed, and for all the lives ruined by fraud and
identity theft, the regime is somehow only about stopping young children 'stumbling on porn'... because the older, more determined, children will still know how to find it anyway.
So the BBFC has laid out its stall, and it's a little more conciliatory to porn users, but I for one will never hand over any ID data to anyone connected with servicing porn websites. I suspect that many others will feel the same. If you can't
trust the biggest companies in the business with your data, what hope is there for anyone else?
There's no word yet on when all this will come into force, but the schedule seems to be 3 months after the BBFC scheme has been approved by Parliament. This approval seems scheduled to be debated in Parliament in early November, e.g. on 5th
November there will be a House of Lords session:
Implementation by the British Board of Film Classification of age-verifications to prevent children accessing pornographic websites 203 Baroness Benjamin Oral questions
So the earliest it could come into force is about mid February.
The BBFC has published its Age Verification Guidance document that will underpin the implementation of internet porn censorship in the UK.
Perhaps a key section is:
5. The criteria against which the BBFC will assess that an age-verification arrangement meets the requirement under section 14(1) to secure that pornographic material is not normally accessible by those under 18 are set out below:
a. an effective control mechanism at the point of registration or access to pornographic content by the end-user which verifies that the user is aged 18 or over at the point of registration or access
b. use of age-verification data that cannot be reasonably known by another person, without theft or fraudulent use of data or identification documents nor readily obtained or predicted by another person
c. a requirement that either a user age-verify each visit or access is restricted by controls, manual or electronic, such as, but not limited to, password or personal identification numbers. A consumer must be logged out by default unless they
positively opt-in for their log in information to be remembered
d. the inclusion of measures which authenticate age-verification data and measures which are effective at preventing use by non-human operators including algorithms
It is fascinating why the BBFC feels that bots need to be banned; perhaps they need to be 18 years old too before they can access porn. I am not sure that porn sites will appreciate Googlebot being banned from their sites. I love the idea
that the word 'algorithms' has been elevated to some sort of living entity.
It all smacks of being written by people who don't know what they are talking about.
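As an aside, criterion (c) above, that a consumer "must be logged out by default" unless they positively opt in to being remembered, maps onto a familiar implementation pattern: issue a browser-session cookie unless the user ticks 'remember me'. A minimal sketch follows; all names (the `av_session` cookie, the `session_cookie` helper, the 30-day lifetime) are illustrative assumptions, not from the BBFC guidance or any real AV provider's API.

```python
# Hypothetical sketch of BBFC criterion (c): an age-verified session is
# not persisted unless the user positively opts in to being remembered.
# Names and lifetimes here are illustrative assumptions only.

REMEMBERED_LIFETIME = 30 * 24 * 3600  # 30 days, only with explicit opt-in


def session_cookie(age_verified: bool, remember_me_opt_in: bool = False) -> dict:
    """Build cookie attributes for an age-verified session.

    By default the cookie carries no max_age, so it is a browser-session
    cookie and the user is effectively logged out when the browser closes.
    A lifetime is set only when the user has positively ticked 'remember me'.
    """
    if not age_verified:
        raise ValueError("cannot issue a session before age verification")
    cookie = {"name": "av_session", "secure": True, "httponly": True}
    if remember_me_opt_in:
        cookie["max_age"] = REMEMBERED_LIFETIME
    # absence of max_age => session cookie => logged out by default
    return cookie
```

The design point is that persistence is opt-in at the API level: the safe behaviour (no stored login) is what you get when the caller does nothing.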
In a quick read I thought the following paragraph was important:
9. In the interests of data minimisation and data protection, the BBFC does not require that age-verification arrangements maintain data for the purposes of providing an audit trail in order to meet the requirements of the act.
It rather suggests that the BBFC pragmatically accepts that convenience and buy-in from porn users is more important than making life dangerous for everybody, just in case a few teenagers get hold of an access code.
The British Board of Film Classification was designated as the age-verification regulator under Part 3 of the Digital Economy Act on 21 February 2018. The BBFC launched its consultation on the draft Guidance on Age-verification Arrangements and
draft Guidance on Ancillary Service Providers on 26 March 2018. The consultation was available on the BBFC's website and asked for comments on the technical aspects of how the BBFC intends to approach its role and functions as the
age-verification regulator. The consultation ran for 4 weeks and closed on 23 April 2018, although late submissions were accepted until 8 May 2018.
There were a total of 624 responses to the consultation. The vast majority of those (584) were submitted by individuals, with 40 submitted by organisations. 623 responses were received via email, and one was received by post. Where express
consent has been given for their publication, the BBFC has published responses in a separate document. Response summaries from key stakeholders are in part 4 of this document.
Responses from stakeholders such as children's charities, age-verification providers and internet service providers were broadly supportive of the BBFC's approach and age-verification standards. Some responses from these groups asked for
clarification to some points. The BBFC has made a number of amendments to the guidance as a result. These are outlined in chapter 2 of this document. Responses to questions raised are covered in chapter 3 of this document.
A significant number of responses, particularly from individuals and campaign groups, raised concerns about the introduction of age-verification, and set out objections to the legislation and regulatory regime in principle. Issues included
infringement of freedom of expression, censorship, problematic enforcement powers and an unmanageable scale of operation. The government's consultation on age-verification in 2016 addressed many of these issues of principle. More information
about why age-verification has been introduced, and the considerations given to the regulatory framework and enforcement powers can be found in the 2016 consultation response by the Department for Digital, Culture, Media and Sport.
New rules on audiovisual media services will apply to broadcasters, and also to video-on-demand and video-sharing platforms
MEPs voted on updated rules on audiovisual media services covering child protection, stricter rules on advertising, and a requirement for 30% European content in video-on-demand.
Following the final vote on this agreement, the revised legislation will apply to broadcasters, but also to video-on-demand and video-sharing platforms, such as Netflix, YouTube or Facebook, as well as to live streaming on video-sharing platforms.
The updated rules will ensure:
Enhanced protection of minors from violence, hatred, terrorism and harmful advertising
Audiovisual media services providers should have appropriate measures in place to combat content inciting violence, hatred and terrorism, while gratuitous violence and pornography will be subject to the strictest rules. Video-sharing platforms will now be
responsible for reacting quickly when content is reported or flagged by users as harmful.
The legislation does not include any automatic filtering of uploaded content, but, at the request of the Parliament, platforms need to create a transparent, easy-to-use and effective mechanism to allow users to report or flag content.
The new law includes strict rules on advertising, product placement in children's TV programmes and content available on video-on-demand platforms. EP negotiators also secured a personal data protection mechanism for children, imposing measures
to ensure that data collected by audiovisual media providers are not processed for commercial use, including for profiling and behaviourally targeted advertising.
Redefined limits of advertising
Under the new rules, advertising can take up a maximum of 20% of the daily broadcasting period between 6:00 and 18:00, giving broadcasters the flexibility to adjust their advertising periods. A prime-time window between 18:00 and 0:00 was also
set out, during which advertising will only be allowed to take up a maximum of 20% of broadcasting time.
30% of European content on the video-on-demand platforms' catalogues
In order to support the cultural diversity of the European audiovisual sector, MEPs ensured that 30% of content in the video-on-demand platforms' catalogues should be European.
Video-on-demand platforms are also asked to contribute to the development of European audiovisual productions, either by investing directly in content or by contributing to national funds. The level of contribution in each country should be
proportional to their on-demand revenues in that country (member states where they are established or member states where they target the audience wholly or mostly).
The legislation also includes provisions regarding accessibility, integrity of a broadcaster's signal, strengthening regulatory authorities and promoting media competences.
The deal still needs to be formally approved by the Council of EU ministers before the revised law can enter into force. Member States have 21 months after its entry into force to transpose the new rules into national legislation.
The text was adopted by 452 votes against 132, with 65 abstentions.
A new section has been added to the AVMS rules re censorship
Member States shall take appropriate measures to ensure that audiovisual media services provided by media service providers under their jurisdiction which may impair the physical, mental or moral development of minors are only made available
in such a way as to ensure that minors will not normally hear or see them. Such measures may include selecting the time of the broadcast, age verification tools or other technical measures. They shall be proportionate to the potential harm of
the programme. The most harmful content, such as gratuitous violence and pornography, shall be subject to the strictest measures.
Personal data of minors collected or otherwise generated by media service providers pursuant to paragraph 1 shall not be processed for commercial purposes, such as direct marketing, profiling and behaviourally targeted advertising.
Member States shall ensure that media service providers provide sufficient information to viewers about content which may impair the physical, mental or moral development of minors. For this purpose, media service providers shall use a system
describing the potentially harmful nature of the content of an audiovisual media service. For the implementation of this paragraph, Member States shall encourage the use of co-regulation as provided for in Article 4a(1).
The Commission shall encourage media service providers to exchange best practices on co-regulatory codes of conduct. Member States and the Commission may foster self-regulation, for the purposes of this Article, through Union codes of
conduct as referred to in Article 4a(2).
Article 4a suggests possible organisation of the censors assigned to the task, e.g. state censors, state-controlled organisations such as Ofcom, or nominally state-controlled co-regulators like the defunct ATVOD.
Article 4a(3) notes that censorial countries like the UK are free to add further censorship rules of their own:
Member States shall remain free to require media service providers under their jurisdiction to comply with more detailed or stricter rules in compliance with this Directive and Union law, including where their national independent regulatory
authorities or bodies conclude that any code of conduct or parts thereof have proven not to be sufficiently effective. Member States shall report such rules to the Commission without undue delay.