
Online Safety Bill


UK Government legislates to censor social media


 

Self-identified 'harms'...

Ofcom publishes report seemingly trying to categorise or classify these 'harms' and their associated risks with a view to its future censorship role


Link Here 25th September 2022
Full story: Online Safety Bill...UK Government legislates to censor social media
Ofcom writes:

The Online Safety Bill, as currently drafted, will require Ofcom to assess, and publish its findings about the risks of harm arising from content that users may encounter on in-scope services, and will require in-scope services to assess the risks of harm to their users from such content, and to have systems and processes for protecting individuals from harm.

Online users can face a range of risks online, and the harms they may experience are wide-ranging, complex and nuanced. In addition, the impact of the same harms can vary between users. In light of this complexity, we need to understand the mechanisms by which online content and conduct may give rise to harm, and use that insight to inform our work, including our guidance to regulated services about how they might comply with their duties.

This report sets out a generic model for understanding how online harms manifest. This research aimed to test a framework, developed by Ofcom, with real-life user experiences. We wanted to explore if there were common risks and user experiences that could provide a single framework through which different harms could be analysed. There are a couple of important considerations when reading this report:

  • The research goes beyond platforms' safety systems and processes to help shed broader light on what people are experiencing online. It therefore touches on issues that are beyond the scope of the proposed online safety regime.

  • The research reflects people's views and experiences of their online world: it is based on people self-identifying as having experienced 'significant harm', whether caused directly or indirectly, or 'illegal content'. Participants' definitions of harmful and illegal content may differ and do not necessarily align with how the Online Safety Bill, Ofcom or others may define them.

 

 

Truss tweaks...

UK Online Censorship Bill set to continue after 'tweaks'


Link Here 16th September 2022
Full story: Online Safety Bill...UK Government legislates to censor social media
After the brief distraction of the royal funeral, the UK's newly elected prime minister has said she will be continuing with the Online Censorship Bill. She said:

We will be proceeding with the Online Safety Bill. There are some issues that we need to deal with. What I want to make sure is that we protect the under-18s from harm and that we also make sure free speech is allowed, so there may be some tweaks required, but certainly he is right that we need to protect people's safety online.

TechDirt comments:

This is just so ridiculously ignorant and uninformed. The Online Safety Bill is a disaster in waiting and I wouldn't be surprised if some websites chose to exit the UK entirely rather than continue to deal with the law.

It won't actually protect the children, of course. It will create many problems for them. It won't do much at all, except make internet companies question whether it's even worth doing business in the UK.

 

 

Not fit for purpose...

British Computer Society experts are not impressed by The Online Censorship Bill


Link Here 15th August 2022
Full story: Online Safety Bill...UK Government legislates to censor social media

Plans to compel social media platforms to tackle online harms are not fit for purpose according to a new poll of IT experts.

Only 14% of tech professionals believed the Online Harms Bill was fit for purpose, according to the survey by BCS, The Chartered Institute for IT.

Some 46% said the bill was not workable, with the rest unsure.

The legislation would have a negative effect on freedom of speech, most IT specialists (58%) told BCS.

Only 19% felt the measures proposed would make the internet safer, with 51% saying the law would not make it safer to be online.

There were nearly 1,300 responses from tech professionals to the survey by BCS.

Just 9% of IT specialists polled said they were confident that legal but harmful content could be effectively and proportionately removed.

Some 74% of tech specialists said they felt the bill would do nothing to stop the spread of disinformation and fake news.

 

 

The UK's Online Censorship Bill...

Legal analysis of UK internet censorship proposals


Link Here 5th July 2022
Full story: Online Safety Bill...UK Government legislates to censor social media

Offsite Article: French lawyers provide the best summary yet

15th June 2022. See article from taylorwessing.com

 

Offsite Article: Have we opened Pandora's box?

20th June 2022. See article from tandfonline.com

Abstract

In thinking about the developing online harms regime (in the UK and elsewhere) it is forgivable to think only of how laws placing responsibility on social media platforms to prevent hate speech may benefit society. Yet these laws could have insidious implications for free speech. By drawing on Germany's Network Enforcement Act I investigate whether the increased prospect of liability, and the fines that may result from breaching the duty of care in the UK's Online Safety Act - once it is in force - could result in platforms censoring more speech, but not necessarily hate speech, and using the imposed responsibility as an excuse to censor speech that does not conform to their objectives. Thus, in drafting a Bill to protect the public from hate speech we may unintentionally open Pandora's Box by giving platforms a statutory justification to take more control of the message.

See full article from tandfonline.com

 

Offsite Article: The Online Safety Act - An Act of Betrayal

5th July 2022. See article from ukcolumn.org by Iain Davis

The Online Safety Bill (OSB) has been presented to the public as an attempt to protect children from online grooming and abuse and to limit the reach of terrorist propaganda.

This, however, does not seem to be its primary focus. The real objective of the proposed Online Safety Act (OSA) appears to be narrative control.

 

 

Offsite Article: The Online Unsafety Bill...


Link Here 26th June 2022
Full story: Online Safety Bill...UK Government legislates to censor social media
Sex work could become 10 times more dangerous due to online safety bill, says sex work group

See article from mancunianmatters.co.uk

 

 

Holy gaslighting...

The clergy with such an appalling record of child abuse presumes to pontificate to everybody else about porn


Link Here 26th June 2022
Full story: Online Safety Bill...UK Government legislates to censor social media
The Guildford Diocesan Synod has submitted a motion to the General Synod, the Church of England's legislative body, seeking to prevent children and young people from online exposure to pornography. The General Synod, which takes place in York next month, will consider the motion.

In the papers published last week, the Rev Charleen Hollington, a member of the Leatherhead Deanery Synod, Guildford, wrote:

Access to pornography means that a distorted and harmful view of what constitutes normal sexual relations is being absorbed by each new generation of children and young people.

This is placing pressure on young boys and girls to conform to stereotypes of domination on the one hand and submission and degradation on the other, and is creating a wider culture of abusive attitudes towards girls and women.

A law requiring age verification for access to commercial porn sites was meant to come into effect in 2018, but it never did, for reasons having to do with bureaucratic delay and then a changed approach by the Government.

Hollington also criticised the Department for Digital, Culture, Media and Sport's Bill, saying:

If passed, the proposed legislation will go some way to addressing the problems. However, legislation introduced in 2018 which was designed to require age verification for access to commercial porn sites never came into effect.

Therefore, the need for the motion to be passed by General Synod now remains as strong as it has always been. The motion acknowledges the current problem, asks the Government to take action and recommends programmes to increase awareness of the harms of pornography.

 

 

Perhaps drug dealers will find a new sideline in selling memory sticks full of porn...

Surveyed porn users indicate that they would be unlikely to hand over their identity documents for age verification


Link Here 22nd April 2022
Full story: Online Safety Bill...UK Government legislates to censor social media
So what will porn users do should their favourite porn site succumb to age verification? Will they decide to use a VPN, try Tor, or exchange porn with their friends, or perhaps there will be an opportunity for a black market to spring up? Another option would be to seek out lesser-known foreign porn sites that can fly under the radar.

All of these options seem more likely than users dangerously handing over identity documents to any porn website that asks.

According to a new survey from YouGov, 78% of the 2,000 adults surveyed would not be willing to verify their age to access adult websites by uploading a document linked to their identity such as a driver's license, passport or other ID card.

Of the participants who believe that visiting adult websites can be part of a healthy sexual lifestyle, just 17% are willing to upload their ID.

The main reasons for their decisions were analysed. 64% just don't trust the companies to keep their data safe while 63% are scared their information could end up in the wrong hands. 49% are concerned about adult websites suffering data breaches which could expose their personal information.

Jim Killock, director of the privacy campaign group Open Rights Group, explained in a press release that those who want to access adult websites anonymously will just use a VPN if the UK's Online Safety legislation passes, saying:

The government assumes that people will actually upload their ID to access adult content. The data shows that this is a naive assumption. Instead, adults will simply use a VPN (as many already do) to avoid the step, or they'll go to smaller, unmoderated sites which exist outside the law. Smaller adult sites tend to be harder to regulate and could potentially expose users, including minors, to more extreme or illegal content.

 

 

Censorship monstrosity...

The UK government's Online Censorship Bill will get a 2nd reading debate in the House of Commons on Tuesday 19th April


Link Here 18th April 2022
Full story: Online Safety Bill...UK Government legislates to censor social media
Repressive new censorship laws return to Parliament for their second reading this week.

  • Online censorship legislation will be debated in the Commons

  • Comes as new plans to support some people and fight deemed falsities online are launched

  • A funding boost will help people's critical thinking online through a new expert Media Literacy Taskforce, alongside proposals to pay for training for teachers and library workers

Parliamentarians will debate the government's groundbreaking Online Censorship Bill which requires social media platforms, search engines and other apps and websites allowing people to post content to censor 'wrong think' content.

Ofcom, the official state censor, will have the power to fine companies failing to comply with the laws up to ten per cent of their annual global turnover, force them to improve their practices and block non-compliant sites. Crucially, the laws have strong measures to safeguard children from harmful content such as pornography and child sexual abuse.

 

 

Online Censorship Bill...

Sex workers speak out against the upcoming censorship of their trade


Link Here 16th April 2022
Full story: Online Safety Bill...UK Government legislates to censor social media

The Online Safety Bill was published on 12 May 2021 with the stated aim of cracking down on harmful content online. A clause has now been added to the bill to include the offence of inciting or controlling prostitution for gain as one of the priority offences that tech companies have to look out for -- firms would then be obliged to remove any content from their platforms that could be construed as committing this offence.

This would be disastrous for sex workers as it would undoubtedly lead to advertising platforms clamping down on sex workers' advertisements in order to avoid any chance of being prosecuted -- essentially criminalising the online advertising of sex work.

Controlling prostitution for gain is interpreted very widely in the criminal courts. Some women in the ECP (English Collective of Prostitutes) have been prosecuted under this offence just for helping a friend build a website or place an advert. Our experience shows that in any crackdown like this, migrant women and women of colour are particularly targeted.

Research shows that online advertising has enabled sex workers to work more safely and independently from exploitative bosses, to screen clients and have more control over our working conditions. Preventing sex workers from advertising will increase violence and the risk of attack. Similar legislation (SESTA/FOSTA) was passed into law by Trump in the US in 2018 resulting in an increase in poverty, insecure housing, suicide, murder, isolation, and the deterioration of physical and mental health for sex workers.

 

 

Vladimir would be proud...

UK Government introduces its Online Censorship Bill which significantly diminishes British free speech whilst terrorising British businesses with a mountain of expense and red tape


Link Here 17th March 2022
Full story: Online Safety Bill...UK Government legislates to censor social media
The UK government's new online censorship laws have been brought before parliament. The Government wrote in its press release:

The Online Safety Bill marks a milestone in the fight for a new digital age which is safer for users and holds tech giants to account. It will protect children from harmful content such as pornography and limit people's exposure to illegal content, while protecting freedom of speech.

It will require social media platforms, search engines and other apps and websites allowing people to post their own content to protect children, tackle illegal activity and uphold their stated terms and conditions.

The regulator Ofcom will have the power to fine companies failing to comply with the laws up to ten per cent of their annual global turnover, force them to improve their practices and block non-compliant sites.

Today the government is announcing that executives whose companies fail to cooperate with Ofcom's information requests could now face prosecution or jail time within two months of the Bill becoming law, instead of two years as it was previously drafted.

A raft of other new offences have also been added to the Bill to make in-scope companies' senior managers criminally liable for destroying evidence, failing to attend or providing false information in interviews with Ofcom, and for obstructing the regulator when it enters company offices.

In the UK, tech industries are blazing a trail in investment and innovation. The Bill is balanced and proportionate with exemptions for low-risk tech and non-tech businesses with an online presence. It aims to increase people's trust in technology, which will in turn support our ambition for the UK to be the best place for tech firms to grow.

The Bill will strengthen people's rights to express themselves freely online and ensure social media companies are not removing legal free speech. For the first time, users will have the right to appeal if they feel their post has been taken down unfairly.

It will also put requirements on social media firms to protect journalism and democratic political debate on their platforms. News content will be completely exempt from any regulation under the Bill.

And, in a further boost to freedom of expression online, another major improvement announced today will mean social media platforms will only be required to tackle 'legal but harmful' content, such as exposure to self-harm, harassment and eating disorders, set by the government and approved by Parliament.

Previously they would have had to consider whether additional content on their sites met the definition of legal but harmful material. This change removes any incentives or pressure for platforms to over-remove legal content or controversial comments and will clear up the grey area around what constitutes legal but harmful.

Ministers will also continue to consider how to ensure platforms do not remove content from recognised media outlets.

Bill introduction and changes over the last year

The Bill will be introduced in the Commons today. This is the first step in its passage through Parliament to become law and beginning a new era of accountability online. It follows a period in which the government has significantly strengthened the Bill since it was first published in draft in May 2021. Changes since the draft Bill include:

  • Bringing paid-for scam adverts on social media and search engines into scope in a major move to combat online fraud.

  • Making sure all websites which publish or host pornography, including commercial sites, put robust checks in place to ensure users are 18 years old or over.

  • Adding new measures to clamp down on anonymous trolls to give people more control over who can contact them and what they see online.

  • Making companies proactively tackle the most harmful illegal content and criminal activity quicker.

  • Criminalising cyberflashing through the Bill.

Criminal liability for senior managers

The Bill gives Ofcom powers to demand information and data from tech companies, including on the role of their algorithms in selecting and displaying content, so it can assess how they are shielding users from harm.

Ofcom will be able to enter companies' premises to access data and equipment, request interviews with company employees and require companies to undergo an external assessment of how they're keeping users safe.

The Bill was originally drafted with a power for senior managers of large online platforms to be held criminally liable for failing to ensure their company complies with Ofcom's information requests in an accurate and timely manner.

In the draft Bill, this power was deferred and so could not be used by Ofcom for at least two years after it became law. The Bill introduced today reduces the period to two months to strengthen penalties for wrongdoing from the outset.

Additional information-related offences have been added to the Bill to toughen the deterrent against companies and their senior managers providing false or incomplete information. They will apply to every company in scope of the Online Safety Bill. They are:

  • offences for companies in scope and/or employees who suppress, destroy or alter information requested by Ofcom;

  • offences for failing to comply with, obstructing or delaying Ofcom when exercising its powers of entry, audit and inspection, or providing false information;

  • offences for employees who fail to attend or provide false information at an interview.

Falling foul of these offences could lead to up to two years' imprisonment or a fine.

Ofcom must treat the information gathered from companies sensitively. For example, it will not be able to share or publish data without consent unless tightly defined exemptions apply, and it will have a responsibility to ensure its powers are used proportionately.

Changes to requirements on 'legal but harmful' content

Under the draft Bill, 'Category 1' companies - the largest online platforms with the widest reach including the most popular social media platforms - must address content harmful to adults that falls below the threshold of a criminal offence.

Category 1 companies will have a duty to carry out risk assessments on the types of legal harms against adults which could arise on their services. They will have to set out clearly in their terms of service how they will deal with such content and enforce these terms consistently. If companies intend to remove, limit or allow particular types of content they will have to say so.

The agreed categories of legal but harmful content will be set out in secondary legislation and subject to approval by both Houses of Parliament. Social media platforms will only be required to act on the priority legal harms set out in that secondary legislation, meaning decisions on what types of content are harmful are not delegated to private companies or left to the whim of internet executives.

It will also remove the threat of social media firms being overzealous and removing legal content because it upsets or offends someone even if it is not prohibited by their terms and conditions. This will end situations such as the incident last year when TalkRadio was forced offline by YouTube for an "unspecified" violation and it was not clear how it had breached the terms and conditions.

The move will help uphold freedom of expression and ensure people remain able to have challenging and controversial discussions online.

The DCMS Secretary of State has the power to add more categories of priority legal but harmful content via secondary legislation should they emerge in the future. Companies will be required to report emerging harms to Ofcom.

Proactive technology

Platforms may need to use tools for content moderation, user profiling and behaviour identification to protect their users.

Additional provisions have been added to the Bill to allow Ofcom to set expectations for the use of these proactive technologies in codes of practice and force companies to use better and more effective tools, should this be necessary.

Companies will need to demonstrate they are using the right tools to address harms, they are transparent, and any technologies they develop meet standards of accuracy and effectiveness required by the regulator. Ofcom will not be able to recommend these tools are applied on private messaging or legal but harmful content.

Reporting child sexual abuse

A new requirement will mean companies must report child sexual exploitation and abuse content they detect on their platforms to the National Crime Agency.

The CSEA reporting requirement will replace the UK's existing voluntary reporting regime and reflects the Government's commitment to tackling this horrific crime.

Reports to the National Crime Agency will need to meet a set of clear standards to ensure law enforcement receives the high quality information it needs to safeguard children, pursue offenders and limit lifelong re-victimisation by preventing the ongoing recirculation of illegal content.

In-scope companies will need to demonstrate existing reporting obligations outside of the UK to be exempt from this requirement, which will avoid duplication of companies' efforts.




 
