Theresa May is creating a new national security unit to counter supposed fake news and disinformation spread by Russia and other foreign powers, Downing Street has announced.
The Prime Minister's official spokesman said the new national security communications unit would build on existing capabilities and would be tasked with combating disinformation by state actors and others. The spokesman said:
We are living in an era of fake news and competing narratives. The government will respond with more and better use of national security communications to tackle these interconnected, complex challenges.
To do this we will build on existing capabilities by creating a dedicated national security communications unit. This will be tasked with combating disinformation by state actors and others.
The new unit has already been dubbed "the Ministry of Truth".
A committee of MPs has claimed that the government is not taking the urgent action needed to protect democracy from fake news on Facebook and other social media.
The culture committee wants a crackdown on the manipulation of personal data, the spread of disinformation and Russian interference in elections. Tory MP Damian Collins, who chairs the committee, says he is disappointed by the response to its
latest report. Collins has accused ministers of making excuses to further delay desperately needed announcements on the ongoing issues of harmful and misleading content being spread through social media.
When the Digital, Culture, Media and Sport Committee issued its interim report on fake news in July, it claimed that the UK faced a democratic crisis founded on the manipulation of personal data.
The MPs called for new powers for the Electoral Commission - including bigger fines - and new regulation of social media firms. But of the 42 recommendations in its interim report, the committee says only three have been accepted by the
government, in its official response, published last week.
The committee has backed calls from the Electoral Commission to force social media advertisers to publish an imprint on political ads to show who had paid for them, to increase transparency. Collins also criticised the government's continued
insistence that there was no evidence of Russian interference in UK elections.
Collins said he would be raising this and other issues with Culture Secretary Jeremy Wright, when he appears before the committee on Wednesday.
The likes of Facebook and Twitter should fund the creation of a new UK internet watchdog to police fake news, censorship campaigners have claimed.
Sounding like a religious morality campaign, the LSE Commission on Truth, Trust and Technology, a group made up of MPs, academics and industry, also proposed the Government should scrap plans to hand fresh powers to existing censors such
as Ofcom and the Information Commissioner.
The campaigners argue for the creation of a new body to monitor the effectiveness of technology companies' self-regulation. The body, which would be called the Independent Platform Agency, would provide a permanent forum for monitoring and
censoring the behaviour of online sites and produce an annual review of the state of disinformation, the group said.
Damian Tambini, adviser to the LSE commission and associate professor in LSE's department of media and communications, claimed:
Parliament, led by the Government, must take action to ensure that we have the information and institutions we need to respond to the information crisis. If we fail to build transparency and trust through independent institutions we could see
the creeping securitisation of our media system.
The Digital, Culture, Media and Sport Committee has published its final report on Disinformation and 'fake news'. The report calls for:
- A compulsory Code of Ethics for tech companies, overseen by an independent regulator
- The regulator given powers to launch legal action against companies breaching the code
- The Government to reform current electoral communications laws and rules on overseas involvement in UK elections
- Social media companies obliged to take down known sources of harmful content, including proven sources of disinformation

The report further finds that:

- Electoral law is 'not fit for purpose'
- Facebook intentionally and knowingly violated both data privacy and anti-competition laws
Damian Collins MP, Chair of the DCMS Committee said:
"Our inquiry over the last year has identified three big threats to our society. The challenge for the year ahead is to start to fix them; we cannot delay any longer.
"Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised 'dark adverts' from unidentifiable sources, delivered through the major social media platforms we use every day. Much of this
is directed from agencies working in foreign countries, including Russia.
"The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.
"Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers.
"These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the 'move fast and break things' culture often seems to be that it is better to apologise than ask permission.
"We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self regulation must come to an end. The rights of the citizen need to be established in statute, by requiring the tech companies
to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.
"We also have to accept that our electoral regulations are hopelessly out of date for the internet age. We need reform so that the same principles of transparency of political communications apply online, just as they do in the real world.
More needs to be done to require major donors to clearly establish the source of their funds.
"Much of the evidence we have scrutinised during our inquiry has focused on the business practices of Facebook; before, during and after the Cambridge Analytica data breach scandal.
"We believe that in its evidence to the Committee Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions.
"Even if Mark Zuckerberg doesn't believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world. Evidence uncovered by my Committee shows he still has questions to answer, yet he's continued to duck
them, refusing to respond to our invitations directly and instead sending representatives who don't have the right information. Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that should be expected from
someone who sits at the top of one of the world's biggest companies.
"We also repeat our call to the Government to make a statement about how many investigations are currently being carried out into Russian interference in UK politics. We want to find out what was the impact of disinformation and voter
manipulation on past elections including the UK Referendum in 2016 and are calling on the Government to launch an independent investigation."
This Final Report on Disinformation and 'Fake News' repeats a number of recommendations from the interim report published last summer. The Committee calls for the Government to reconsider a number of recommendations to which it did not respond
and to include concrete proposals for action in its forthcoming White Paper on online harms.
Independent regulation of social media companies
The Report repeats a recommendation from the Interim Report for clear legal liabilities to be established for tech companies to act against harmful or illegal content on their sites, and the report calls for a compulsory Code of Ethics defining
what constitutes harmful content. An independent regulator should be responsible for monitoring tech companies, backed by statutory powers to launch legal action against companies in breach of the code.
Companies failing obligations on harmful or illegal content would face hefty fines. MPs conclude: "Social media companies cannot hide behind the claim of being merely a 'platform' and maintain that they have no responsibility themselves in
regulating the content of their sites."
The Report's recommendation chimes with recent statements by Ministers indicating the Government is prepared to regulate social media companies following the death of teenager Molly Russell. The Committee hopes to see firm recommendations for
legislation in the White Paper to create a regulatory system for online content that is as effective as that for offline content.
It repeats its recommendation for new independent regulation to be funded by a levy on tech companies operating in the UK.
Data use and data targeting
The Report highlights Facebook documents obtained by the Committee and published in December 2018 relating to a Californian court case brought by app developer Six4Three. Through scrutiny of internal Facebook emails between 2011 and 2015, the
Report finds evidence to indicate that the company was willing to: override its users' privacy settings in order to transfer data to some app developers; charge high prices in advertising to some developers for the exchange of data; and
starve some developers, such as Six4Three, of that data, contributing to the loss of their business. MPs conclude: "It is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws."
It recommends that the ICO carries out a detailed investigation into the practices of the Facebook platform, its use of users' and users' friends' data, and the use of 'reciprocity' of the sharing of data. The CMA (Competition and Markets
Authority) should conduct a comprehensive audit of the advertising market on social media and investigate whether Facebook has been involved in anti-competitive practices.
MPs note that Facebook, in particular, is unwilling to be accountable to regulators around the world: "By choosing not to appear before the Committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has
shown contempt towards both our Committee and the 'International Grand Committee' involving members from nine legislatures from around the world."