6th May 2021

German academic publishes a survey confirming that 16 and 17 year olds are keen porn viewers, and suggests that they will surely find ways to work around age verification.

See article from onlinelibrary.wiley.com
Government notes that porn websites without user comments or uploads will not be within the censorship regime of the upcoming Online Safety Bill

27th March 2021

See article from questions-statements.parliament.uk
Written Question, answered on 24 March 2021

Baroness Grender, Liberal Democrat life peer: To ask Her Majesty's Government which commercial pornography companies will be in scope of the Online Safety Bill; and whether commercial pornography websites which do not host user-generated content, or allow private user communication, will also be in scope.

Baroness Barran, Conservative: The government is committed to ensuring children are protected from accessing online pornography through the new online safety framework. Where pornography sites host user-generated content or facilitate online user interaction such as video and image sharing, commenting and live streaming, they will be subject to the new duty of care. Commercial pornography sites which allow private user-to-user communication will be in scope. Where commercial pornography sites do not have user-generated functionality they will not be in scope.

The online safety regime will capture both the most visited pornography sites and pornography on social media, therefore covering the majority of sites where children are most likely to be exposed to pornography. We expect companies to use age assurance or age verification technologies to prevent children from accessing services which pose the highest risk of harm to children, such as online pornography. We are working closely with stakeholders across industry to establish the right conditions for the market to deliver age assurance and age verification technical solutions ahead of the legislative requirements coming into force.
Floella Benjamin attempts to resuscitate internet porn age verification in the Domestic Abuse Bill

11th February 2021

See Government statement about age verification (11th January 2021) from questions-statements.parliament.uk
See attempt to resuscitate porn age verification in the Domestic Abuse Bill (10th February 2021) from hansard.parliament.uk
Campaigners have been trying to revive the deeply flawed and one-sided porn age verification scheme ever since it was abandoned by the Government in October 2019. The Government was asked about the possibility of restoring it in January 2021 in the House of Commons. Caroline Dinenage responded for the government:

The Government announced in October 2019 that it will not commence the age verification provisions of Part 3 of the Digital Economy Act 2017 and instead deliver these protections through our wider online harms regulatory proposals.

Under our online harms proposals, we expect companies to use age assurance or age verification technologies to prevent children from accessing services which pose the highest risk of harm to children, such as online pornography. The online harms regime will capture both the most visited pornography sites and pornography on social media, therefore covering the vast majority of sites where children are most likely to be exposed to pornography. Taken together we expect this to bring into scope more online pornography currently accessible to children than would have been covered by the narrower scope of the Digital Economy Act.

We would encourage companies to take steps ahead of the legislation to protect children from harmful and age inappropriate content online, including online pornography. We are working closely with stakeholders across industry to establish the right conditions for the market to deliver age assurance and age verification technical solutions ahead of the legislative requirements coming into force.

In addition, Regulations transposing the revised Audiovisual Media Services Directive came into force on 1 November 2020, which require UK-established video sharing platforms to take appropriate measures to protect minors from harmful content. The Regulations require that the most harmful content is subject to the strongest protections, such as age assurance or more technical measures. Ofcom, as the regulatory authority, may take robust enforcement action against video sharing platforms which do not adopt appropriate measures.
Now, during the passage of the Domestic Abuse Bill in the House of Lords, Floella Benjamin attempted to revive the age verification requirement by proposing the following amendment:

Insert the following new Clause -- Impact of online pornography on domestic abuse

(1) Within three months of the day on which this Act is passed, the Secretary of State must commission a person appointed by the Secretary of State to investigate the impact of access to online pornography by children on domestic abuse.

(2) Within three months of their appointment, the appointed person must publish a report on the investigation which may include recommendations for the Secretary of State.

(3) As part of the investigation, the appointed person must consider the extent to which the implementation of Part 3 of the Digital Economy Act 2017 (online pornography) would prevent domestic abuse, and may make recommendations to the Secretary of State accordingly.

(4) Within three months of receiving the report, the Secretary of State must publish a response to the recommendations of the appointed person.

(5) If the appointed person recommends that Part 3 of the Digital Economy Act 2017 should be commenced, the Secretary of State must appoint a day for the coming into force of that Part under section 118(6) of the Act within the timeframe recommended by the appointed person.

Member's explanatory statement: This amendment would require an investigation into any link between online pornography and domestic abuse, with a view to implementing recommendations to bring into effect the age verification regime in the Digital Economy Act 2017 as a means of preventing domestic abuse.
Floella Benjamin made a long speech supporting the censorship measure and was supported by a number of peers. Of course they all argued only from the 'think of the children' side of the debate, and not one of them mentioned trashed adult businesses or the risk to porn viewers of being outed, scammed, blackmailed etc.

See Floella Benjamin's speech from hansard.parliament.uk
The Government outlines its final plans to introduce new and wide-ranging internet censorship laws

15th December 2020

See press release from gov.uk
See also full government response to the Online Harms White Paper consultation
Digital Secretary Oliver Dowden and Home Secretary Priti Patel have announced the government's final decisions on new internet censorship laws:

- New rules to be introduced for nearly all tech firms that allow users to post their own content or interact
- Firms failing to protect people face fines of up to ten per cent of turnover or the blocking of their sites, and the government will reserve the power for senior managers to be held liable
- Popular platforms to be held responsible for tackling both legal and illegal harms
- All platforms will have a duty of care to protect children using their services
- Laws will not affect articles and comments sections on news websites, and there will be additional measures to protect free speech
The full government response to the Online Harms White Paper consultation sets out how the proposed legal duty of care on online companies will work in practice and gives them new responsibilities towards their users. The safety of children is at the heart of the measures.

Social media sites, websites, apps and other services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal content such as child sexual abuse, terrorist material and suicide content. The Government is also progressing work with the Law Commission on whether the promotion of self harm should be made illegal.

Tech platforms will need to do far more to protect children from being exposed to harmful content or activity such as grooming, bullying and pornography. This will help make sure future generations enjoy the full benefits of the internet with better protections in place to reduce the risk of harm.

The most popular social media sites, with the largest audiences and high-risk features, will need to go further by setting and enforcing clear terms and conditions which explicitly state how they will handle content which is legal but could cause significant physical or psychological harm to adults. This includes dangerous disinformation and misinformation about coronavirus vaccines, and will help bridge the gap between what companies say they do and what happens in practice.

Ofcom is now confirmed as the regulator with the power to fine companies failing in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher. It will have the power to block non-compliant services from being accessed in the UK.

The legislation includes provisions to impose criminal sanctions on senior managers. The government will not hesitate to bring these powers into force should companies fail to take the new rules seriously - for example, if they do not respond fully, accurately and in a timely manner to information requests from Ofcom. This power would be introduced by Parliament via secondary legislation, and reserving the power to compel compliance follows similar approaches in other sectors such as financial services regulation.

The government plans to bring the laws forward in an Online Safety Bill next year and set the global standard for proportionate yet effective regulation. This will safeguard people's rights online and empower adult users to keep themselves safe while preventing companies arbitrarily removing content. It will defend freedom of expression and the invaluable role of a free press, while driving a new wave of digital growth by building trust in technology businesses.

Scope

The new regulations will apply to any company in the world hosting user-generated content online accessible by people in the UK, or enabling them to privately or publicly interact with others online.

It includes social media, video sharing and instant messaging platforms, online forums, dating apps, commercial pornography websites, as well as online marketplaces, peer-to-peer services, consumer cloud storage sites and video games which allow online interaction. Search engines will also be subject to the new regulations.

The legislation will include safeguards for freedom of expression and pluralism online - protecting people's rights to participate in society and engage in robust debate. Online journalism from news publishers' websites will be exempt, as will reader comments on such sites. Specific measures will be included in the legislation to make sure journalistic content is still protected when it is reshared on social media platforms.

Categorised approach

Companies will have different responsibilities for different categories of content and activity, under an approach focused on the sites, apps and platforms where the risk of harm is greatest.

All companies will need to take appropriate steps to address illegal content and activity such as terrorism and child sexual abuse. They will also be required to assess the likelihood of children accessing their services and, if so, provide additional protections for them. This could be, for example, by using tools that give age assurance to ensure children are not accessing platforms which are not suitable for them.

The government will make clear in the legislation the harmful content and activity that the regulations will cover, and Ofcom will set out how companies can fulfil their duty of care in codes of practice.

A small group of companies with the largest online presences and high-risk features, likely to include Facebook, TikTok, Instagram and Twitter, will be in Category 1. These companies will need to assess the risk of legal content or activity on their services with "a reasonably foreseeable risk of causing significant physical or psychological harm to adults". They will then need to make clear what type of "legal but harmful" content is acceptable on their platforms in their terms and conditions, and enforce this transparently and consistently.

All companies will need mechanisms so people can easily report harmful content or activity while also being able to appeal the takedown of content. Category 1 companies will be required to publish transparency reports about the steps they are taking to tackle online harms. Examples of Category 2 services are platforms which host dating services or pornography, and private messaging apps. Less than three per cent of UK businesses will fall within the scope of the legislation, and the vast majority of companies will be Category 2 services.

Exemptions

Financial harms will be excluded from this framework, including fraud and the sale of unsafe goods. This will mean the regulations are clear and manageable for businesses, focus action where there will be most impact, and avoid duplicating existing regulation.

Where appropriate, lower-risk services will be exempt from the duty of care to avoid putting disproportionate demands on businesses. This includes exemptions for retailers who only offer product and service reviews, and software used internally by businesses. Email services will also be exempt.

Some types of advertising, including organic and influencer adverts that appear on social media platforms, will be in scope. Adverts placed on an in-scope service through a direct contract between an advertiser and an advertising service, such as Facebook or Google Ads, will be exempt because this is covered by existing regulation.

Private communications

The response will set out how the regulations will apply to communication channels and services where users expect a greater degree of privacy - for example online instant messaging services and closed social media groups, which are still in scope. Companies will need to consider the impact on user privacy and ensure they understand how their systems and processes affect people's privacy, but firms could, for example, be required to make services safer by design by limiting the ability for anonymous adults to contact children.

Given the severity of the threat on these services, the legislation will enable Ofcom to require companies to use technology to monitor, identify and remove tightly defined categories of illegal material relating to child sexual exploitation and abuse. Recognising the potential impact on user privacy, the government will ensure this is only used as a last resort where alternative measures are not working. It will be subject to stringent legal safeguards to protect user rights.
Ofcom consults about its plans to tool up for its new roles as the UK internet censor

11th December 2020

See article from ofcom.org.uk
See Ofcom work plan [pdf] from ofcom.org.uk
Ofcom has opened a consultation on its plan to get ready for its likely role as the UK internet censor under the Government's Online Harms legislation. Ofcom writes:

We have today published our plan of work for 2021/22. This consultation sets out our goals for the next financial year, and how we plan to achieve them. We are consulting on this plan of work to encourage discussion with companies, governments and the public.

As part of the Plan of Work publication, we are also holding some virtual events to invite feedback on our proposed plan. These free events are open to everyone, and offer an opportunity to comment and ask questions. The consultation ends on 5th February 2021.

The key areas referencing internet censorship are:

Preparing to regulate online harms

3.26 The UK Government has given Ofcom new duties as the regulator for UK-established video-sharing platforms (VSPs) through the transposition of the European-wide Audiovisual Media Services Directive. VSPs are a type of online video service where users can upload and share videos with members of the public, such as YouTube and TikTok. Ofcom will not be responsible for regulating all VSPs as our duties only apply to services established in the UK, and as such we anticipate that a relatively small number of services fall within our jurisdiction. Under the new regulations, which came into force on 1 November 2020, VSPs must have appropriate measures in place to protect children from potentially harmful content and all users from criminal content and incitement to hatred and violence. VSPs will also need to make sure certain advertising standards are met.

3.27 As well as appointing Ofcom as the regulator of UK-established VSPs, the Government has announced that it is minded to appoint Ofcom as the future regulator responsible for protecting users from harmful online content. With this in mind we are undertaking the following work:

Video-sharing platforms regulation. We have issued a short guide to the new requirements. On 19 November 2020 we issued draft scope and jurisdiction guidance for consultation to help providers self-assess whether they need to notify to Ofcom as a VSP under the statutory rules from April 2021. We will also consult in early 2021 on further guidance on the risk of harms and appropriate measures, as well as proposals for a co-regulatory relationship with the Advertising Standards Authority (ASA) with regards to VSP advertising. We intend to issue final versions of the guidance in summer 2021.

Preparing for the online harms regime. The UK Government has set out that it intends to put in place a regime to keep people safe online. In February 2020 it published an initial response to the 2019 White Paper setting out how it intends to develop the regime, which stated that it was minded to appoint Ofcom as the future regulator of online harms. If confirmed, these proposed new responsibilities would constitute a significant expansion to our remit, and preparing for them would be a major area of focus in 2021/22. We will continue to provide technical advice to the UK Government on its policy development process, and we will engage with Parliament as it considers legislative proposals.

3.29 We will continue work to deepen our understanding of online harms through a range of work:

Our Making Sense of Media programme. This programme will continue to provide insights on the needs, behaviours and attitudes of people online. Our other initiatives to research online markets and technologies will further our understanding of how online harms can be mitigated.

Stepping up our collaboration with other regulators. As discussed in the Developing Strong Partnerships section, we will continue our joint work through the Digital Regulators Cooperation Forum and strengthen our collaboration with regulators around the world who are also considering online harms.

Understanding VSPs. The introduction of regulation of UK-established VSPs will provide a solid foundation to inform and develop the broader future online harms regulatory framework. This interim regime is more limited in terms of the number of regulated companies and will cover a narrower range of harms compared to the online harms white paper proposals. However, should Ofcom be confirmed as the regulator, through our work on VSPs we will develop on-the-job experience working with newly regulated online services, developing the evidence base of online harm, and building our internal skills and expertise.
House of Lords approves adoption of the EU's internet video sharing censorship laws into post-Brexit UK law

29th November 2020

See House of Lords transcription from theyworkforyou.com
The House of Lords approved a statutory instrument that adopts the EU's Audio Visual Media Services Directive into post-Brexit UK law. This law describes state censorship requirements for internet video sharing platforms. The law change was debated on 27th November 2020, with the government introducing the law as follows:

Baroness Barran, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport: My Lords, I am pleased to introduce this instrument, laid in both Houses on 15 October, which is being made under the European Union (Withdrawal) Act 2018. These regulations remedy certain failures of retained EU law arising from the withdrawal of the United Kingdom from the EU. This instrument seeks to maintain, but not expand, Ofcom's remit to regulate video-sharing platform services. This intervention is necessary to ensure the law remains operable beyond the end of the transition period.

The EU's audiovisual media services directive, known as the AVMS directive, governs the co-ordination of national legislation on audio-visual media services. The AVMS directive was initially implemented into UK law in 2010, primarily by way of amendments to UK broadcasting legislation. The directive was subsequently revised in 2018. The UK Audiovisual Media Services Regulations 2020, which transposed the revised AVMS directive, were made and laid in Parliament on 30 September. Those regulations came into force on 1 November and introduced, for the first time, rules for video-sharing platform services. The Government have appointed Ofcom as the regulator for these services. The new rules ensure that platforms falling within UK jurisdiction have appropriate systems and processes to protect the public, including minors, from illegal and harmful material.

There were three key requirements placed on video-sharing platforms under the regulations. These were: to take appropriate measures to protect minors under 18 from harmful content, to take appropriate measures to protect the general public from harmful and certain illegal content, and to introduce standards around advertising.

I also draw the attention of the House to the report from the Secondary Legislation Scrutiny Committee considering this instrument, and I thank its members for their work. I will now address the committee's concerns regarding jurisdiction.

The AVMS directive sets out technical rules governing when a platform falls within a country's jurisdiction. First, there must be a physical presence, or a group undertaking, of the platform in the country. Where there is a physical presence in more than one country, jurisdiction is decided on the basis of factors such as whether the platform is established in that country, whether the platform's main economic activity is centred in that country, and the hierarchy of group undertakings as set out by the directive.

Under the revised AVMS directive, each EU member state and the UK is responsible for regulating only the video-sharing platforms that fall within its jurisdiction. There will be only one country that has jurisdiction for each platform at any one time. However, if a platform has no physical presence in any country covered by the AVMS directive, then no country will have jurisdiction over it, even if the platform provides services in those countries.

Through this instrument, we are seeking to maintain the same position for Ofcom's remit beyond the end of the transition period. This position allows Ofcom to regulate video-sharing platforms established in the UK, and additionally to regulate platforms that have a physical presence in the UK but not in any other country covered by the AVMS directive. Although Ofcom's remit will not be extended to include platforms established elsewhere in the EU, we believe UK users will indirectly benefit from the EU's regulation of platforms under the AVMS directive. The regulation under this regime is systems regulation, not content regulation. We therefore expect that as platforms based outside of the UK set up and invest in systems to comply with the AVMS regulations, it is probable that these same systems will also be introduced for their UK subsidiaries.

In the absence of this instrument, Ofcom would no longer be able to regulate any video-sharing platforms. This would result in an unacceptable regulatory gap and a lack of protection for UK users using these services. Our approach also mitigates the small risk that a video-sharing platform offering services to countries covered by the AVMS directive, but not the UK, would establish itself in the UK in order to circumvent EU law.

While we recognise that most children have a positive experience online, the reality is that the impact of harmful content and activity online can be particularly damaging for children. Over three-quarters of UK adults also express a deep concern about the internet. The UK is one of only three countries to have transposed the revised directive thus far, evidencing our commitment to protecting users online.

These regulations also pave the way for the upcoming online harms regulatory regime. Given that the online harms regulatory framework shares broadly the same objectives as the video-sharing platform regime, it is the Government's intention that the regulation of video-sharing platforms in the UK will be superseded by the online harms legislation, once the latter comes into force. Further details on the plans for online harms regulation will be set out in the full government response to the consultation on the Online Harms White Paper, which is due to be published later this year, with draft legislation ready in early 2021. With that, I beg to move.
Ofcom publishes its censorship guidelines to be applied to UK based video sharing platforms

21st October 2020

See article from ofcom.org.uk
See censorship guidelines [pdf] from ofcom.org.uk
Ofcom has published its burdensome censorship rules that will apply to video sharing platforms that are stupid enough to be based in the UK. In particular, the rules are quite vague about age verification requirements for the two adult video sharing sites that remain in the UK. Maybe Ofcom is a bit shy about requiring onerous and unviable red tape of British companies trying to compete with large numbers of foreign companies that operate with the massive commercial advantage of not having age verification. Ofcom does however note that these censorship rules are a stopgap until a wider-scoped 'online harms' censorship regime starts up in the next couple of years.

Ofcom writes:

Video-sharing platforms (VSPs) are a type of online video service which allows users to upload and share videos with members of the public. From 1 November 2020, UK-established VSPs will be required to comply with new rules around protecting users from harmful content.

The main purpose of the new regulatory regime is to protect consumers who engage with VSPs from the risk of viewing harmful content. Providers must have appropriate measures in place to protect minors from content which might impair their physical, mental or moral development, and to protect the general public from criminal content and material likely to incite violence or hatred.

Ofcom has published a short guide outlining the new statutory requirements on providers. The guide is intended to assist platforms to determine whether they fall in scope of the new regime and to understand what providers need to do to ensure their services are compliant. The guide also explains how Ofcom expects to approach its new duties in the period leading up to the publication of further guidance on the risk of harms and appropriate measures, which we will consult on in early 2021. Ofcom will also be consulting on guidance on scope and jurisdiction later in 2020.

VSP providers will be required to notify their services to Ofcom from 6 April 2021, and we expect to have the final guidance in place ahead of this time.
The Government's Online Harms bill will require foreign social media companies to appoint a token fall guy in Britain who will be jailed should the company fail in its duty of care. I wonder what the salary will be?

31st December 2019

From The Times
The government is pushing forward with an internet censorship bill which will punish people and companies for getting it wrong, without the expense and trouble of trying to dictate rules on what is allowed. In an interesting development, the Times is reporting that the government wants to introduce a "senior management liability", under which executives could be held personally responsible for breaches of standards. US tech giants would be required to appoint a British-based director, who would be accountable for any breaches of the censorship rules.

It seems a little unjust to prosecute a token fall guy who is likely to have absolutely no say in the day to day decisions made by a foreign company. Still, it should be a very well paid job, which hopefully includes lots of coverage for legal bills and a zero notice period allowing instant resignation at the first hint of trouble.
MPs and campaigners call for 'misogyny' to be defined as an 'online harm' requiring censorship by social media. What could go wrong?

7th September 2019

See article from theguardian.com
MPs and activists have urged the government to protect women through censorship. They write in a letter:

Women around the world are 27 times more likely to be harassed online than men. In Europe, 9 million girls have experienced some kind of online violence by the time they are 15 years old. In the UK, 21% of women have received threats of physical or sexual violence online. The basis of this abuse is often, though not exclusively, misogyny.

Misogyny online fuels misogyny offline. Abusive comments online can lead to violent behaviour in real life. Nearly a third of respondents to a Women's Aid survey said where threats had been made online from a partner or ex-partner, they were carried out. Along with physical abuse, misogyny online has a psychological impact. Half of girls aged 11-21 feel less able to share their views due to fear of online abuse, according to Girlguiding UK.

The government wants to make Britain the safest place in the world to be online, yet in the online harms white paper, abuse towards women online is categorised as harassment, with no clear consequences, whereas similar abuse on the grounds of race, religion or sexuality would trigger legal protections. If we are to eradicate online harms, far greater emphasis in the government's efforts should be directed to the protection and empowerment of the internet's single largest victim group: women.

That is why we back the campaign group Empower's calls for the forthcoming codes of practice to include and address the issue of misogyny by name, in the same way as they would address the issue of racism by name. Violence against women and girls online is not harassment. Violence against women and girls online is violence.

Signatories:
Ali Harris, Chief executive, Equally Ours
Angela Smith MP, Independent
Anne Novis, Activist
Lorely Burt, Liberal Democrat, House of Lords
Ruth Lister, Labour, House of Lords
Barry Sheerman MP, Labour
Caroline Lucas MP, Green
Daniel Zeichner MP, Labour
Darren Jones MP, Labour
Diana Johnson MP, Labour
Flo Clucas, Chair, Liberal Democrat Women
Gay Collins, Ambassador, 30% Club
Hannah Swirsky, Campaigns officer, René Cassin
Joan Ryan MP, Independent Group for Change
Joe Levenson, Director of communications and campaigns, Young Women's Trust
Jonathan Harris, House of Lords, Labour
Luciana Berger MP, Liberal Democrats
Mandu Reid, Leader, Women's Equality Party
Maya Fryer, WebRoots Democracy
Preet Gill MP, Labour
Sarah Mann, Director, Friends, Families and Travellers
Siobhan Freegard, Founder, Channel Mum
Jacqui Smith, Empower
Offsite Patreon Comment: What will go wrong? See subscription article from patreon.com
Monday is the last day to respond and the Open Rights Group makes some suggestions

30th June 2019

See article from openrightsgroup.org

The Government is accepting public feedback on its plan until Monday 1 July. Send a message to its consultation using the Open Rights Group's tool before the end of Monday!
The Open Rights Group comments on the government censorship plans:

Online Harms: Blocking websites doesn't work -- use a rights-based approach instead

Blocking websites isn't working. It's not keeping children safe and it's stopping vulnerable people from accessing information they need. It's not the right approach to take on Online Harms. This is the finding from our recent research into website blocking by mobile and broadband Internet providers. And yet, as part of its Internet regulation agenda, the UK Government wants to roll out even more blocking.

The Government's Online Harms White Paper is focused on making online companies fulfil a "duty of care" to protect users from "harmful content" -- two terms that remain troublingly ill-defined. The paper proposes giving a regulator various punitive measures to use against companies that fail to fulfil this duty, including powers to block websites. If this scheme comes into effect, it could lead to widespread automated blocking of legal content for people in the UK.

Mobile and broadband Internet providers have been blocking websites with parental control filters for five years. But through our Blocked project -- which detects incorrect website blocking -- we know that systems are still blocking far too many sites and far too many types of sites by mistake. Thanks to website blocking, vulnerable people and under-18s are losing access to crucial information and support from websites including counselling, charity, school, and sexual health websites. Small businesses are losing customers. And website owners often don't know this is happening.

We've seen with parental control filters that blocking websites doesn't have the intended outcomes. It restricts access to legal, useful, and sometimes crucial information. It also does nothing to prevent people who are determined to get access to material on blocked websites, who often use VPNs to get around the filters. Other solutions, like filters applied by a parent to a child's account on a device, are more appropriate. Unfortunately, instead of noting these problems inherent to website blocking by Internet providers and rolling back, the Government is pressing ahead with website blocking in other areas.
Blocking by Internet providers may not work for long. We are seeing a technical shift towards encrypted website address requests that will make this kind of website blocking by Internet providers much more difficult.

When I type a human-friendly web address such as openrightsgroup.org into a web browser and hit enter, my computer asks the Domain Name System (DNS) for that website's computer-friendly IP address - which will look something like 46.43.36.233. My web browser can then use that computer-friendly address to load the website.

At the moment, most DNS requests are unencrypted. This allows mobile and broadband Internet providers to see which website I want to visit. If a website is on a blocklist, the system won't return the actual IP address to my computer. Instead, it will tell me that the site is blocked, or will tell my computer that the site doesn't exist. That stops me visiting the website and makes the block effective.

Increasingly, though, DNS requests are being encrypted. This provides much greater security for ordinary Internet users. It also makes website blocking by Internet providers incredibly difficult. Encrypted DNS is becoming widely available through Google's Android devices, on Mozilla's Firefox web browser and through Cloudflare's mobile application for Android and iOS. Other encrypted DNS services are also available. Our report DNS Security - Getting it Right discusses issues around encrypted DNS in more detail.
Blocking websites may be the Government's preferred tool to deal with social problems on the Internet but it doesn't work, both in policy terms and increasingly at a technical level as well. The Government must accept that website blocking by mobile and broadband Internet providers is not the answer. They should concentrate instead on a rights-based approach to Internet regulation and on educational and social approaches that address the roots of complex societal issues.
Offsite Article: Speech is not a tripping hazard -- Cyberleagle's response to the Online Harms Consultation. 30th June 2019. See article from cyberleagle.com
15th June 2019

Who'd have thought that a Christian campaign group would be calling on its members to criticise the government's internet censorship bill in a consultation?

See article from christianconcern.com
20th April 2019

An interesting look at the government's Online Harms white paper proposing extensive internet censorship for the UK.

See article from cyberleagle.com
Status report on the government's plans to introduce an internet censor for social media

30th January 2019

See article from politico.eu
See also Matt Hancock tells social media giants to remove suicide and self-harm material from telegraph.co.uk
The UK government is rushing to finalise a draft internet censorship law particularly targeting social media, but key details of the proposal have yet to be settled amid concerns about stifling innovation. Government officials have been meeting with industry players, MPs, peers and other groups over the past month as they try to finalise their proposals. People involved in those discussions said there is now broad agreement about the need to impose a new duty of care on big tech companies, as well as the need to back up their terms and conditions with the force of law. A white paper is due to be published by the end of winter.

But the Department for Digital, Culture, Media and Sport, which is partly responsible for writing up the new rules alongside the Home Office, is still deliberating over key aspects with just weeks to go until the government said it would unveil an outline of its proposals. Among the sticking points are worries that regulation could stifle innovation in one of the UK economy's most thriving sectors, and concerns over whether it can keep pace with rapid technological change. Another is ensuring sufficient political support to pass the law despite likely opposition from parts of the Conservative Party. A third is deciding what regulatory agency would ultimately be responsible for enforcing the so-called Internet Safety Law.

A major unresolved question is which censorship body will be in charge of enforcing laws that could expose big tech companies to greater liability for hosted content, a prospect that firms including Google and Facebook have fought at the European level. Several people who spoke to POLITICO said the government does not appear to have settled on who would be the censor, although the communications regulator Ofcom is very much in the mix; however, there are concerns that Ofcom is already getting too big.