The European Commission has drafted new laws to force ISPs to block child porn. The measure will be voted on by the European
Parliament next month. The technical solutions envisaged are broadly based on arrangements in the UK, where all major ISPs block access to child abuse websites named on a list maintained by the Internet Watch Foundation (IWF).
If the laws are passed as proposed, the UK government will get powers to force the small ISPs that do not use the IWF blocklist – which serve fewer than 2% of British internet users – to fall into line. Last year the Home Office abandoned
a pledge to enforce 100% compliance.
Although voluntary, the British system is not without controversy, and EuroISPA, the European ISP trade association, is lobbying MEPs to reject the move to enforce it across the bloc.
Malcolm Hutty, the President of EuroISPA, said: In order to make the Directive on child sexual exploitation as strong as possible, emphasis must be placed on making swift notice and takedown of child sexual abuse material
focused and effective. Blocking, as an inefficient measure, should be avoided. Law enforcement authorities' procedures for rapid communication to internet hosting providers of such illegal material must be reviewed and bottlenecks eliminated.
Cecilia Malmström, the European commissioner for home affairs, is worried that MEPs' amendments to a draft directive on the sexual abuse and exploitation of children would make it more difficult for EU member states to block access to websites carrying such images.
The European Parliament's civil liberties committee is to vote on the European Commission's proposal and MEPs' amendments on 14th February.
At present, it is up to member states whether they want to block websites carrying such content. The Commission is seeking to introduce an obligation on all member states to block access in cases where removal is impossible.
A majority of member states back the mandatory blocking of internet sites, but the measure has run into trouble with MEPs, and Germany, Ireland and Luxembourg have openly rejected it.
Some of the hundreds of amendments to the draft directive put forward by MEPs would introduce EU-wide rules that would make it more difficult for member states to continue blocking websites. Many MEPs are concerned about the implications of website
blocking for freedom of speech.
I am a liberal, I consider free speech as a fundamental value and I have fought for that all my life, so accusations that I'm trying to censor the internet and limit freedom of speech really go to my heart because that is absolutely not what I'm
trying to do, Malmström said. But I have seen those pictures; they have nothing to do with freedom of speech. This is a horrible violation.
She also rejected the slippery-slope argument -- the notion that once the EU imposed rules on blocking access to one type of website, it could do so for other types in the future. I intend in no way to propose any other type of blocking for any other
thing, but this particular crime demands particular attention.
The European parliament's civil liberties, justice and home affairs committee (LIBE) will meet in Strasbourg tomorrow, when it is expected to approve a controversial measure that would compel EU member states to inform internet publishers that their
images are to be deleted from the internet or blocked for reasons of child pornography.
Publishers will also have to be informed of their right to appeal against any removal or blocking.
The measure would make the UK's system for blocking and removing child pornography without informing the publisher illegal.
MEPs seem more concerned with the rights of child pornographers than they do with the rights of children who have been sexually abused to make their foul, illegal images, said John Carr, the secretary of the Children's Charities Coalition on
Internet Safety (and an adviser to the UK government on child internet safety!).
Surely it is only publishers of legal, non-child porn material who would appeal. If they can show that their sites are legal, then it is absolutely correct that they should be able to prove their point.
Child pornographers, on the other hand, would have no grounds for appeal: their material is illegal and will stay removed or blocked.
The EU has taken a step towards common rules against those who sexually abuse children and post images of the abuse on the internet.
A committee of Euro MPs backed an EU draft directive calling for child abuse images to be removed at source.
Where removal is impossible - for example, because web pages are hosted outside the EU - then the abuse images may be blocked by national authorities.
MEPs aim to adopt the new rules later this year, after further negotiations.
MEPs insisted that any moves to block access to images on the web must be accompanied by transparent procedures and provide adequate safeguards so that the restriction is limited to what is necessary and proportionate.
The safeguards would include informing users of the reason for the block and informing content providers and users of their right to appeal.
The European Court of Justice has given a preliminary opinion that will have far-reaching implications in the fight against
overaggressive copyright monopoly abusers. It is not a final verdict, but the Advocate General's position; the Court generally follows this. The Advocate General says that no ISP can be required to filter the Internet, and particularly not to
enforce the copyright monopoly.
The opinion is very clear: Advocate General Cruz Villalón considers that the installation of that filtering and blocking system is a restriction on the right to respect for the privacy of communications and the right to protection of personal
data, both of which are rights protected under the Charter of Fundamental Rights. By the same token, the deployment of such a system would restrict freedom of information, which is also protected by the Charter of Fundamental Rights.
Broadband providers have voiced alarm over an EU proposal to create a Great Firewall of Europe by blocking illicit web material
at the borders of the bloc.
The proposal emerged at an obscure meeting of the Council of the European Union's Law Enforcement Work Party (LEWP), a forum for cooperation on issues such as counter-terrorism, customs and fraud.
The minutes from the meeting state:
The Presidency of the LEWP presented its intention to propose concrete measures towards creating a single secure European cyberspace with a certain virtual Schengen border and virtual access points whereby the
Internet Service Providers (ISP) would block illicit contents on the basis of the EU black-list. Delegations were also informed that a conference on cyber-crime would be held in Budapest on 12-13 April 2011.
Malcolm Hutty, head of public affairs at LINX, a cooperative of British ISPs, said the plan appeared ill thought-out and confused. We take the view that network level filtering of the type proposed has been proven ineffective.
Broadband providers say that illegal content should be removed at the source by cooperation between police and web hosting firms because network blocking can easily be circumvented.
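As a rough illustration of the circumvention point, here is a toy sketch of a hostname-level blocklist filter (all names and addresses are made up): it catches requests addressed to the blocked name, but is blind to the same server reached by raw IP, through a proxy, or via an alternative DNS resolver.

```python
from urllib.parse import urlparse

# Hypothetical hostname blocklist, standing in for an ISP-side filter.
BLOCKLIST = {"badsite.example"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's hostname appears on the blocklist."""
    host = urlparse(url).hostname
    return host in BLOCKLIST

# The hostname match catches the obvious request...
print(is_blocked("http://badsite.example/page"))  # True
# ...but the same content fetched by IP address sails straight past it.
print(is_blocked("http://203.0.113.7/page"))      # False
```

This is why providers argue for removal at source: a name-based or address-based block changes nothing on the offending server itself.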
The Committee on Civil Liberties, Justice and Home Affairs (LIBE) of the European Parliament has adopted a compromise text agreed with the Council and the Commission on the draft Child Sexual Exploitation Directive. The compromise text allows Member
States to introduce mandatory blocking measures for Internet sites containing child abuse images, but does not require them as the Council had proposed.
Article 21: Measures against websites containing or disseminating child pornography:
Member States shall take the necessary measures to ensure the prompt removal of webpages containing or disseminating child pornography hosted in their territory and to endeavour to obtain the removal of such pages hosted outside of their territory.
Member States may take measures to block access to webpages containing or disseminating child pornography towards the Internet users in their territory. These measures must be set by transparent procedures and provide adequate
safeguards, in particular to ensure that the restriction is limited to what is necessary and proportionate, and that users are informed of the reason for the restriction. These safeguards shall also include the possibility of judicial redress.
Civil liberties groups will be pleased at having defeated mandatory blocking across Europe, but disappointed at having failed to ensure that judicial authority is required before an ISP can be forced to block an Internet address.
The draft Directive is due to be adopted in the autumn.
The European Parliament has approved new rules that will implement tough penalties for offences related to child
porn online. The resolution was adopted by the European Parliament with 541 votes in favor and two against.
The directive will require EU countries to remove child porn websites at source or, where removal is not possible, allow them to block access to those pages. EU member states will have two years to transpose the rules into national law.
The new rules will outline requirements on prevention, prosecution of offenders and protection of victims.
Association of Sites Advocating Child Protection Executive Director Tim Henning said:
It covers all the major bases and will make it less difficult for EU authorities to prosecute these heinous crimes against children. It will also help to reduce the proliferation and consumption of child pornography content.
But he noted one troubling aspect of blocking suspected website pages:
This needs to be completely transparent in order to prevent EU territories from blocking legal adult entertainment that may be mistaken for illegal child porn. The directive has stated this will be the case.
The rules set out penalties for about 20 criminal offenses. For instance, coercing a child into sexual actions or forcing a child into prostitution will be punishable by at least 10 years in prison. Child pornography producers will face at least 3
years, and viewers of online child pornography will face at least 1 year.
The Council of the EU has adopted a directive aimed at combating sexual abuse and exploitation of children as well as child pornography.
The directive will harmonise around twenty relevant criminal offences, at the same time setting high levels of penalties.
The new rules, which have to be transposed into national law within two years, also include provisions to fight against online child pornography and sex tourism. They also aim to prevent convicted paedophiles moving to another EU member state from
exercising professional activities involving regular contacts with children. Finally, the directive introduces measures to protect the child victim during investigations and legal proceedings.
Concerning online child pornography, the text obliges member states to ensure the prompt removal of such websites hosted in their territory and to endeavour to obtain their removal if hosted outside of their territory.
In addition, member states may block access to such web pages, but must follow transparent procedures and provide safeguards if they make use of this possibility.
Job vetting will also extend to a Europe-wide level, with a reliable check for EU nationals applying for jobs related to the care of children. In addition, within the EU, higher protection of children will be achieved once member
states implement the directive and fully commit themselves to circulating data on disqualifications from their criminal records. It is currently very difficult to vet foreign EU nationals applying for jobs related to the care of children.
In legal advice to the EU Court of Justice, Advocate General Pedro Cruz Villalón has advised that EU law allows for ISPs to be ordered to block their customers from accessing known copyright infringing sites.
The opinion, which relates to a dispute between a pair of movie companies and an Austrian ISP over the now-defunct site Kino.to, is not legally binding. However, the advice of the Advocate General is usually followed in such cases.
The current dispute involves Austrian ISP UPC Telekabel Wien and movie companies Constantin Film Verleih and Wega Filmproduktionsgesellschaft. The film companies complained that the ISP was providing its subscribers with access to Kino.to which enabled
them to access their copyrighted material without permission.
Interim injunctions were granted in the movie companies' favor which required the ISP to block the site. However, the Austrian Supreme Court later asked the Court of Justice to clarify whether a provider that gives Internet access to
users of an illegal website should be regarded as an intermediary, in the same way that the host of an illegal site might be.
In his opinion, Advocate General Pedro Cruz Villalón said that the ISP of a user accessing a website said to be infringing copyright should also be regarded as an intermediary whose services are used by a third party, such as the operator of an
infringing website. This means that the ISP of an infringing site's user can be subjected to a blocking injunction, as long as it contains specifics on the technicalities.
The European Parliament is currently considering EU wide website blocking powers.
The latest draft of the directive on combating terrorism contains proposals on blocking websites that promote or incite terror attacks. Member states may take all necessary measures to remove or to block access to webpages publicly inciting to commit
terrorist offences, says the text submitted by German MEP and rapporteur Monika Hohlmeier.
Digital rights activists have argued that it leaves the door wide open to over-blocking and censorship as safeguards defending proportionality and fundamental rights can be skipped if governments opt for voluntary schemes implemented by ISPs.
Amendments have been proposed that would require any take down or Web blocking to be subject to full judicial oversight and authorisation.
Last week, Estonian MEP Marju Lauristin told Ars she was very disappointed with the text, saying it was jeopardising freedom of expression as enshrined in the Charter of Fundamental Rights of the EU.
The measure will be up for a vote by the civil liberties committee on 27th June.
ISPs that block access to websites with adult content or block ads could be breaking EU guidelines on net neutrality even if customers opt in.
EU regulations only allow providers to block content for three reasons: to comply with a member state's laws, to manage levels of traffic across a network, or for security.
Blocking websites with adult content has no clear legal framework in UK legislation, and providers have relied on providing the ability to opt in to protect themselves from falling foul of the rules. However, an update to guidelines issued by EU body
Berec says that even if a person indicates they want certain content to be blocked, it should be done on their device, rather than at a network level. The updated guidelines say:
With regard to some of the suggestions made by stakeholders about traffic management features that could be requested or controlled by end-users, Berec notes that the regulation does not consider that end-user consent enables ISPs to engage in such
practices at the network level.
End-users may independently choose to apply equivalent features, for example via their terminal equipment or more generally on the applications running at the terminal equipment, but Berec considers that management of such features at the network level
would not be consistent with the regulation.
Frode Sorensen, co-chair of the Berec expert working group on net neutrality, said the updated guidance made it clear that Berec had found no legal basis for using customer choice to justify blocking any content without national legislation or for reasons of
traffic management or security.
David Cameron said in October last year that he had secured an opt-out from the rules enabling British internet providers to introduce porn filters. However, Sorensen said he was not aware of any opt-out, and the net neutrality rules introduced in
November, after Cameron made his claim, state that they apply to the whole European Economic Area, which includes the UK.
Social media giants Facebook, Google and Twitter will be forced to change their terms of service for EU users within a
month, or face hefty fines from European authorities, an official said on Friday.
The move was initiated after politicians have decided to blame their unpopularity on 'fake news' rather than their own incompetence and their failure to listen to the will of the people.
The EU Commission sent letters to the three companies in December, stating that some terms of service were in breach of EU consumer protection laws, and urged them to do more to prevent fraud on their platforms. The EU has also urged social media companies
to do more when it comes to assessing the suitability of user generated content.
The letters, seen by Reuters, explained that the EU Commission also wanted clearer signposting for sponsored content, and that mandatory rights, such as cancelling a contract, could not be interfered with.
Germany said this week it is working on a new law that would see social media sites face fines of up to $53 million if they failed to strengthen their efforts to remove material that the EU does not like. German censorship minister Heiko Maas said:
There must be as little space for criminal incitement and slander on social networks as on the streets. Too few criminal comments are deleted and they are not erased quickly enough. The biggest problem is that networks do not take the complaints
of their own users seriously enough...it is now clear that we must increase the pressure on social networks.
Article 13: Monitoring and filtering of internet content is unacceptable. Index on Censorship joined with 56 other NGOs to call for the deletion of Article
13 from the proposal on Copyright in the Digital Single Market, which includes obligations on internet companies that would be impossible to respect without the imposition of excessive restrictions on citizens' fundamental rights.
Dear President Juncker,
Dear President Tajani,
Dear President Tusk,
Dear Prime Minister Ratas,
Dear Prime Minister Borissov,
Dear MEP Voss, MEP Boni
The undersigned stakeholders represent fundamental rights organisations.
Fundamental rights, justice and the rule of law are intrinsically linked and constitute core values on which the EU is founded. Any attempt to disregard these values undermines the mutual trust between member states required for the EU to
function. Any such attempt would also undermine the commitments made by the European Union and national governments to their citizens.
Article 13 of the proposal on Copyright in the Digital Single Market includes obligations on internet companies that would be impossible to respect without the imposition of excessive restrictions on citizens' fundamental rights.
Article 13 introduces new obligations on internet service providers that share and store user-generated content, such as video or photo-sharing platforms or even creative writing websites, including obligations to filter uploads to their services.
Article 13 appears to provoke such legal uncertainty that online services will have no other option than to monitor, filter and block EU citizens' communications if they are to have any chance of staying in business.
Article 13 contradicts existing rules and the case law of the Court of Justice. The Directive on Electronic Commerce (2000/31/EC) regulates the liability of those internet companies that host content on behalf of their users. According to
the existing rules, there is an obligation to remove any content that breaches copyright rules, once this has been notified to the provider.
Article 13 would force these companies to actively monitor their users' content, which contradicts the 'no general obligation to monitor' rules in the Electronic Commerce Directive. The requirement to install a system for filtering electronic
communications has twice been rejected by the Court of Justice, in the cases Scarlet Extended (C-70/10) and Netlog/Sabam (C-360/10). Therefore, a legislative provision that requires internet companies to install a filtering system would
almost certainly be rejected by the Court of Justice because it would contravene the requirement that a fair balance be struck between the right to intellectual property on the one hand, and the freedom to conduct business and the right to freedom
of expression, such as to receive or impart information, on the other.
In particular, the requirement to filter content in this way would violate the freedom of expression set out in Article 11 of the Charter of Fundamental Rights. If internet companies are required to apply filtering mechanisms in order to
avoid possible liability, they will. This will lead to excessive filtering and deletion of content and limit the freedom to impart information on the one hand, and the freedom to receive information on the other.
If EU legislation conflicts with the Charter of Fundamental Rights, national constitutional courts are likely to be tempted to disapply it and we can expect such a rule to be annulled by the Court of Justice. This is what happened with the
Data Retention Directive (2006/24/EC), when EU legislators ignored compatibility problems with the Charter of Fundamental Rights. In 2014, the Court of Justice declared the Data Retention Directive invalid because it violated the Charter.
Taking into consideration these arguments, we ask the relevant policy-makers to delete Article 13.
European Digital Rights (EDRi)
Associação D3 -- Defesa dos Direitos Digitais
Associação Nacional para o Software Livre (ANSOL)
Association for Progressive Communications (APC)
Association for Technology and Internet (ApTI)
Association of the Defence of Human Rights in Romania (APADOR)
Bangladesh NGOs Network for Radio and Communication (BNNRC)
Bits of Freedom (BoF)
Bulgarian Helsinki Committee
Center for Democracy & Technology (CDT)
Centre for Peace Studies
Coalizione Italiana Libertà e Diritti Civili (CILD)
Code for Croatia
Culture Action Europe
Electronic Frontier Foundation (EFF)
Estonian Human Rights Centre
Freedom of the Press Foundation
Frënn vun der Ënn
Helsinki Foundation for Human Rights
Hermes Center for Transparency and Digital Human Rights
Human Rights Monitoring Institute
Human Rights Watch
Human Rights Without Frontiers
Hungarian Civil Liberties Union
Index on Censorship
International Partnership for Human Rights (IPHR)
International Service for Human Rights (ISHR)
Justice & Peace
La Quadrature du Net
Media Development Centre
Miklos Haraszti (Former OSCE Media Representative)
Modern Poland Foundation
Netherlands Helsinki Committee
One World Platform
Open Observatory of Network Interference (OONI)
Open Rights Group (ORG)
Plataforma en Defensa de la Libertad de Información (PDLI)
Reporters without Borders (RSF)
Rights International Spain
South East Europe Media Organisation (SEEMO)
South East European Network for Professionalization of Media (SEENPM)
The Right to Know Coalition of Nova Scotia (RTKNS)
The European Union is in the process of creating an authority to monitor and censor so-called fake news. It is setting up a High-Level 'Expert'
Group. The EU is currently consulting media professionals and the public to decide what powers to give to this EU body, which is to begin operation next spring.
The World Socialist Web Site
has its own colourful view on the intentions of the body, but I don't suppose it is too far from the truth:
An examination of the EU's announcement shows that it is preparing mass state censorship aimed not at false information, but at news reports or political views that encourage popular opposition to the European ruling class.
It aims to create conditions where unelected authorities control what people can read or say online.
EU Vice-President Frans Timmermans explained the move in ominous terms:
We live in an era where the flow of information and misinformation has become almost overwhelming. The EU's task is to protect its citizens from fake news and to manage the information they receive.
According to an EU press release, the EU Commission, another unelected body, will select the High-Level Expert Group, which is to start in January 2018 and will work over several months. It will discuss possible future actions to strengthen
citizens' access to reliable and verified information and prevent the spread of disinformation online.
Who will decide what views are verified, who is reliable and whose views are disinformation to be deleted from Facebook or removed from Google search results? The EU, of course.
The European Union voted on November 14 to pass the new internet censorship regulation, nominally in the name of consumer protection. But of course
censorship often hides behind consumer protection, e.g. the UK's upcoming internet porn ban is enacted in the name of protecting under-18 internet consumers.
The new EU-wide law gives extra power to national consumer protection agencies, but also contains a vaguely worded clause that grants them the power to block and take down websites without judicial oversight.
Member of the European Parliament Julia Reda said in a speech in the European Parliament Plenary during a last ditch effort to amend the law:
The new law establishes overreaching Internet blocking measures that are neither proportionate nor suitable for the goal of protecting consumers and come without mandatory judicial oversight,
According to the new rules, national consumer protection authorities can order any unspecified third party to block access to websites without requiring judicial authorization, Reda added later in the day on her blog.
This new law is an EU regulation and not a directive, meaning it is directly binding on all EU states.
The new law proposal started out with good intentions, but sometime in the spring of 2017 the proposed regulation received a series of amendments that watered down some consumer protections while keeping intact the provisions that ensure national
consumer protection agencies can go after and block or take down websites.
Presumably multinational companies had been lobbying for new weapons in their battle against copyright infringement. For instance, the new law gives national consumer protection agencies the legal power to inquire and obtain information about
domain owners from registrars and Internet Service Providers.
Besides the website blocking clause, authorities will also be able to request information from banks to detect the identity of the responsible trader, to freeze assets, and to carry out mystery shopping to check for geographical discrimination.
Comment: European Law Claims to Protect Consumers... By Blocking the Web
The Consumer Protection Regulation provides in Article 8(3)(e) that consumer protection authorities must have the power:
where no other effective means are available to bring about the cessation or the prohibition of the infringement including by requesting a third party or other public authority to implement such measures, in order to prevent the risk of serious
harm to the collective interests of consumers:
to remove content or restrict access to an online interface or to order the explicit display of a warning to consumers when accessing the online interface;
to order a hosting service provider to remove, disable or restrict the access to an online interface; or
where appropriate, order domain registries or registrars to delete a fully qualified domain name and allow the competent authority concerned to register it;
The risk of unelected public authorities being given the power to block websites was powerfully demonstrated in 2014, when the Australian company regulator ASIC
accidentally blocked 250,000 websites
in an attempt to block just a handful of sites alleged to be defrauding Australian consumers.
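The ASIC incident is easy to model: when blocking is done at the IP level, every site that shares a hosting address goes dark along with the target. A toy sketch (all names and addresses here are illustrative, not ASIC's actual data):

```python
# Many unrelated sites commonly share one hosting IP address.
sites = {
    "fraud.example":   "198.51.100.5",
    "recipes.example": "198.51.100.5",  # same shared host
    "school.example":  "198.51.100.5",  # same shared host
    "news.example":    "203.0.113.9",
}

# The regulator blocks the IP of the single fraudulent site...
blocked_ips = {sites["fraud.example"]}

# ...and every innocent neighbour on that address becomes collateral damage.
collateral = [name for name, ip in sites.items()
              if ip in blocked_ips and name != "fraud.example"]
print(collateral)  # ['recipes.example', 'school.example']
```

Scale the shared host up to the thousands of sites behind a large hosting provider's address and a single IP-level order produces exactly the kind of mass overblocking ASIC stumbled into.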
This likelihood of unlawful overblocking is just one of the reasons that the United Nations Special Rapporteur for Freedom of Expression and Opinion has underlined how web blocking often contravenes international human rights law. In a
report [PDF], then Special Rapporteur Frank La Rue set out how extremely limited the circumstances are in which blocking of websites can be justified, noting that where:
the specific conditions that justify blocking are not established in law, or are provided by law but in an overly broad and vague manner, [this] risks content being blocked arbitrarily and excessively. ... [E]ven where justification is provided,
blocking measures constitute an unnecessary or disproportionate means to achieve the purported aim, as they are often not sufficiently targeted and render a wide range of content inaccessible beyond that which has been deemed illegal. Lastly,
content is frequently blocked without the intervention of or possibility for review by a judicial or independent body.
This describes exactly what the new Consumer Protection Regulation will do. It hands over a power that should only be exercised, if at all, under the careful scrutiny of a judge in the most serious of cases, and allows it to be wielded at the whim
of an unelected consumer protection agency. As explained by Member of the European Parliament (MEP) Julia Reda, who voted against the legislation, it sets the stage for the construction of a censorship infrastructure that could be misused for purposes that we cannot even anticipate, ranging from copyright enforcement through to censorship of political speech.
Regrettably, the Regulation is now law, and is required to be enforced by all European states. It is both ironic and tragic that a law intended to protect consumers actually poses such a dire threat to their right to freedom of expression.
The third evaluation of the EU's 'Code of Conduct' on censoring 'illegal online hate speech' carried out by NGOs and
public bodies shows that IT companies removed on average 70% of posts claimed to contain 'illegal hate speech'.
However, some challenges remain, in particular the lack of systematic feedback to users.
Google+ announced today that it is joining the Code of Conduct, and Facebook confirmed that Instagram would also do so, further expanding the number of actors covered by it.
Vera Jourová, with the oxymoronic title of EU Commissioner for Justice, Consumers and Gender Equality, said:
The Internet must be a safe place, free from illegal hate speech, free from xenophobic and racist content. The Code of Conduct is now proving to be a valuable tool to tackle illegal content quickly and efficiently. This shows that where there is
a strong collaboration between technology companies, civil society and policy makers we can get results, and at the same time, preserve freedom of speech. I expect IT companies to show similar determination when working on other important issues,
such as the fight with terrorism, or unfavourable terms and conditions for their users.
On average, IT companies removed 70% of all the 'illegal hate speech' notified to them by the NGOs and public bodies participating in the evaluation. This rate has steadily increased from 28% in the first monitoring round in 2016 and 59% in the
second monitoring exercise in May 2017.
The Commission will continue to regularly monitor the implementation of the Code by the participating IT companies with the help of civil society organisations, and aims to widen it to further online platforms. The Commission will consider
additional measures if efforts are not pursued or slow down.
Of course there is no mention of the possibility that some reports of supposed 'illegal hate speech' are not actioned because they are simply wrong, perhaps just the politically correct being easily offended. We seem to live in an unjust age where
the accuser is always considered right and the merits of the case count for absolutely nothing.
A few MEPs have produced a YouTube video highlighting the corporate and state censorship that would be enabled by an EU proposal to require social media posts to be approved by an automated censorship machine before posting.
In a new campaign video, several Members of the European Parliament warn that the EU's proposed
mandatory upload filters pose a threat to freedom of speech. The new filters would function as censorship machines which are "completely disproportionate," they say. The MEPs encourage the public to speak up, while they still can.
Through a series of new proposals, the European Commission is working hard to modernize
EU copyright law. Among other things, it will require online services to do more to fight piracy.
These proposals have not been without controversy. Article 13 of the proposed Copyright Directive, for example, has been widely criticized as it would require online services to monitor and filter uploaded content.
This means that online services, which deal with large volumes of user-uploaded content, must use fingerprinting or other detection mechanisms -- similar to YouTube's Content-ID system -- to block copyright infringing files.
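In its simplest form, such a detection mechanism compares a fingerprint of each upload against a blocklist of known files. A minimal sketch follows; all names are hypothetical, and it uses an exact SHA-256 hash where a real Content-ID-style system would use perceptual fingerprints that survive re-encoding, cropping and transcoding:

```python
import hashlib

# Hypothetical blocklist of fingerprints of known infringing files.
# A real system would hold perceptual audio/video fingerprints, not
# exact hashes; SHA-256 is used here purely for illustration.
BLOCKLIST = {
    hashlib.sha256(b"known infringing file bytes").hexdigest(),
}

def fingerprint(data: bytes) -> str:
    """Compute a fingerprint of an uploaded file (here, a SHA-256 hash)."""
    return hashlib.sha256(data).hexdigest()

def is_blocked(upload: bytes) -> bool:
    """Return True if the upload matches a blocklisted fingerprint."""
    return fingerprint(upload) in BLOCKLIST

print(is_blocked(b"known infringing file bytes"))  # True: exact match
print(is_blocked(b"an original home video"))       # False: not on the list
```

The weakness the MEPs point to is visible even in this toy version: the filter can only say "matches the list" or not, and has no way to recognise that a matching clip might be quotation, parody, or other fair use.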
The Commission believes that more stringent control is needed to support copyright holders. However, many legal experts, digital activists, and members of the public worry that the measures will violate the rights of regular Internet users.
In the European Parliament, there is fierce opposition as well. Today, six Members of Parliament (MEPs) from across the political spectrum released a new campaign video warning their colleagues and the public at large.
The MEPs warn that such upload filters would act as censorship machines, something they've made clear to the Council's working group on intellectual property, where the controversial proposal was discussed today.
Imagine if every time you opened your mouth, computers controlled by big companies would check what you were about to say, and have the power to prevent you from saying it, Greens/EFA MEP Julia Reda says.
A new legal proposal would make this a reality when it comes to expressing yourself online: Every clip and every photo would have to be pre-screened by some automated 'robocop' before it could be uploaded and seen online, ALDE MEP Marietje Schaake adds.
Stop censorship machines!
Schaake notes that she has dealt with the consequences of upload filters herself. When she uploaded a recording of a political speech to YouTube, the site took it down without explanation. To this day, the MEP still doesn't know on what grounds it was removed.
These broad upload filters are completely disproportionate and a danger for freedom of speech, the MEPs warn. The automated systems make mistakes and can't properly detect whether something's fair use, for example.
Another problem is that the measures will be relatively costly for smaller companies, which puts them at a competitive disadvantage. "Only the biggest platforms can afford them -- European competitors and small businesses will struggle,"
ECR MEP Dan Dalton says.
The plans can still be stopped, the MEPs say. They are currently scheduled for a vote in the Legal Affairs Committee at the end of March, and the video encourages members of the public to raise their voices.
Speak out ...while you can still do so unfiltered! S&D MEP Catherine Stihler says.
Illegal content and terrorist propaganda are still spreading rapidly online in the European Union -- just not on mainstream platforms, new analysis shows.
Twitter, Google and Facebook all play by EU rules when it comes to illegal content, namely hate speech and terrorist propaganda, policing their sites voluntarily.
But with increased scrutiny on mainstream sites, alt-right and terrorist sympathizers are flocking to niche platforms where illegal content is shared freely, security experts and anti-extremism activists say.
The European Union has given Google, YouTube, Facebook, Twitter and other internet companies three months to show that
they are removing extremist content more rapidly or face legislation forcing them to do so.
The European Commission said on Thursday that internet firms should be ready to remove extremist content within an hour of being notified and recommended measures they should take to stop its proliferation. Digital commissioner Andrus Ansip said:
While several platforms have been removing more illegal content than ever before ... we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens' security, safety and fundamental rights.
The EC said that it would assess within three months whether legislation targeting technology firms is needed, if demonstrable improvement is not made on what it describes as terrorist content. For all other types of 'illegal' content, the EC will assess the technology firms' censorship progress within six months.
It also urged the predominantly US-dominated technology sector to adopt a more proactive approach, with automated systems to detect and censor 'illegal' content.
The European Commission proposes designating internet censors, which it euphemistically calls 'trusted flaggers', and then requiring internet hosting companies to censor whatever the 'trusted flaggers' say
The EU Commission has recommended an internet censorship system that sounds like something straight out of China. The system consists of designating police, state censors, commercial censors acting for the state, and perhaps independent groups like the IWF. These are euphemistically known as trusted flaggers.
Website and content hosting companies will then be required to remove any content these flaggers identify (nominally illegal content) in a timely manner.
The IWF usefully summarises the proposals as follows:
The EU Commission's proposals to tackle illegal content online include:
Hosting providers and Member States being prepared to submit all monitoring information to the Commission, upon request, within six months (three months for terrorist content) in order for the Commission to assess whether
further legislation is required.
Definitions for "illegal content" and "trusted flaggers" to be introduced.
Fast track procedures should be introduced for materials referred by trusted flaggers.
Hosting providers to publish a list of who they consider to be a "trusted flagger".
Automated takedown of content is encouraged, but should have safeguards such as human oversight.
Terrorist content should be removed within one hour.
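The deadlines above imply a prioritised takedown queue on the hosting side. A minimal sketch of how a host might assign removal deadlines by content type and flag source follows; the one-hour figure for terrorist content is from the proposal, while the 24-hour default for other trusted-flagger reports and all function names are assumptions for illustration:

```python
from datetime import datetime, timedelta
from typing import Optional

# Illustrative deadlines: one hour for terrorist content (per the proposal);
# the 24-hour figure for other content is an assumed default.
DEADLINES = {
    "terrorist": timedelta(hours=1),
    "other": timedelta(hours=24),
}

def removal_deadline(flagged_at: datetime, content_type: str,
                     trusted_flagger: bool) -> Optional[datetime]:
    """Return the removal deadline for a report, or None if the report
    came from an untrusted source and gets no fast-track deadline."""
    if not trusted_flagger:
        return None  # ordinary reports go through the normal review queue
    return flagged_at + DEADLINES.get(content_type, DEADLINES["other"])

t0 = datetime(2018, 3, 1, 12, 0)
print(removal_deadline(t0, "terrorist", trusted_flagger=True))  # 13:00 same day
```

Note that in this scheme everything hinges on who is granted trusted-flagger status: their reports skip the ordinary review queue entirely.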
A loss of trust in Facebook in the light of the Cambridge Analytica scandal could prompt the EU to scrap its voluntary code of conduct on the removal of online hate speech in favour of legislation and heavy sanctions, European commissioner Vera Jourová said.
The EU's executive is examining how to have hateful content censored swiftly by social media platforms, with legislation being one option that could replace the current system.
Jourová said she would be grilling Sheryl Sandberg, Facebook's chief operating officer, later this week over unanswered questions about the company's past errors and future plans.
Jourová said she was wary of following the German path, because of the thin line between removing offensive material and censorship, but said all options were on the table.
Brussels may threaten social media companies with censorship laws unless they move urgently to tackle supposed 'fake
news' and Cambridge Analytica-style data abuse.
The EU security commissioner, Julian King, said short-term, concrete plans needed to be in place before the elections, when voters in 27 EU member states will elect MEPs.
Under King's ideas, social media companies would sign a voluntary code of conduct to prevent the misuse of platforms to pump out misleading information.
The code would include a pledge for greater transparency, so users would be made aware why their Facebook or Twitter feed was presenting them with certain adverts or stories. Another proposal is for political adverts to be accompanied with
information about who paid for them.