Laetitia Avia was hailed as a symbol of French diversity when she entered parliament for Emmanuel Macron's centrist party in 2017. But the daily racist abuse against her on social networks pushed her to draw up an extreme censorship law to put a stop to it.
It states that hateful comments reported by users must be removed within 24 hours by platforms such as Twitter, Facebook or YouTube. This includes any hateful attack on someone's dignity on the basis of race, religion, sexual orientation, gender identity or disability. If the social media platforms and tech companies do not comply, they will face huge fines of up to 4% of their global revenue. Penalties could reach tens of millions of euros. There will also be a new judicial body to focus on online hate.
The online hatred bill will be debated by the French parliament next week and could be fast-tracked into force in the autumn.
The bill is part of Macron's drive towards internet censorship. He announced the planned crackdown on online hate at a dinner for Jewish groups last year, amid a rise in antisemitic acts in France, saying that hateful content online must be taken down fast
and all possible techniques put in place to find the identities of those behind it.
Last month, after meetings with Macron, Facebook's Mark Zuckerberg agreed to hand over to judges the identification data on its French users suspected of hate speech.
The French law to censor politically incorrect insults on social media websites was passed by the National Assembly on Friday.
Under the French draft law, social media groups would have to put in place tools to allow users to alert them to clearly illicit content related to race, gender, religion, sexual orientation or disability.
In the event that a network fails to react in due course or to offer the necessary means to report such content, it could face fines of up to 4 per cent of its global revenues.
France's broadcasting censor, CSA, would be responsible for imposing the sanctions and a dedicated prosecutor's office would be created.
Several internet and freedom of speech advocacy groups have pointed out that the bill paves the way for state censorship because it does not clearly define illicit content.
Imposing a 24-hour limit to remove clearly unlawful content is likely to result in significant restrictions on freedoms, such as the overblocking of lawful comments or the misuse of the measure for political censorship purposes, said La Quadrature du Net, a group that advocates free speech on the internet.
The group also highlighted that a law adopted in 2004 already demanded the removal of hateful content, but in a responsive way, leaving platforms enough time to assess the seriousness of the content under review.
The bill now passes to the French Senate for further debate.
The Broadcasting Authority of Ireland will police video content on Facebook under new proposals before the Irish Government.
The Sunday Independent reports the BAI aims to become an enlarged media commission to enforce European censorship rules.
The BAI currently regulates Irish commercial radio and television as well as RTE and TG4.
With the social media giants based in Ireland, it will now regulate content on Facebook, Twitter and YouTube in Ireland and throughout the EU.
The BAI proposals also want an Online Safety Commissioner to form part of its increased censorship role. They also speak of age verification, parental controls and a complaints mechanism.
The Government is also keen to emulate the UK internet porn censorship regime. Taoiseach Leo Varadkar said the Irish government will consult with the UK about its new porn block and how it is working, with a view to perhaps rolling out a similar age
verification system for Ireland.
Varadkar said that he was wary of moralising, but suggested that engagement with the UK government a year or two after the law has been rolled out would be wise. He said that this engagement could help ascertain whether the proposals could work.
During Leaders' Questions, he confirmed that an online age verification system can be discussed by the Oireachtas Communications Committee, and confirmed that legislation to set up the office of a Digital Safety Commissioner is on the way.
Justice Minister Charlie Flanagan has also said the Irish government will consider a similar system to the UK's porn block law as part of new legislation on online safety.
AN MP in Spain is leading an initiative to force porn websites operating in the country to install strict age verification systems.
The recently elected 26-year-old Andrea Fernandez has called for an end to the culture of porn among young people. The limitation of pornographic content online was included in the electoral programme of the newly elected Prime Minister, Pedro Sanchez (Social Democrats). The goal of the new government is to implement a strict new age verification system for these kinds of websites.
IFCO has published its annual report covering 2018.
It notes that the number of cinema films passed is about the same as the previous year, with 448 releases in 2018. However it reports that video DVD submissions (presumably including Blu-ray) declined by 15% to 2,621 submissions in 2018.
IFCO reports on 2 appeals in 2018; both appeals were rejected and the ratings remained unaltered. The two films were the 18-rated The First Purge and the 12A-rated Bumblebee.
The number of complaints received by IFCO has always been minimal. IFCO writes:
During 2018, IFCO received 18 complaints from the public which related specifically to classifications awarded. The most received in respect of any one title was 6 in the case of SHOW DOGS, a comedy classified PG for Mild violence,
language and rude humour. Of these, two were from people who had not seen the film.
IFCO has also just upgraded its website to make it a bit smarter. IFCO acknowledged that it needs to up its game in interacting with the public. IFCO wrote in the report:
It is to be hoped that the updated website will be more visited and perhaps encourage people to contact IFCO. All constructive input, whether positive or negative is very welcome and informs as to people's expectations of IFCO service
The idea of an open global internet keeps taking a beating -- and the worst offender is not, say, China or Russia, but rather the EU.
We've already discussed things like the EU Copyright Directive and the Terrorist Content Regulation, but it seems like every day there's something new and more ridiculous -- and the latest may be coming from the Court of Justice of the EU (CJEU). The
CJEU's Advocate General has issued a recommendation (but not the final verdict) in a new case that would be hugely problematic for the idea of a global open internet that isn't weighted down with censorship.
The case at hand involved someone on Facebook posting a link to an article about an Austrian politician, Eva Glawischnig-Piesczek, accusing her of being a "lousy traitor of the people", a "corrupt oaf" and a member of a "fascist party".
An Austrian court ordered Facebook to remove the content, which it complied with by blocking access for anyone in Austria. The original demand was also that Facebook be required to prevent equivalent content from appearing as well. On appeal, a court rejected Facebook's argument that it only had to comply in Austria, and also said that blocking of equivalent content could be limited to cases where someone alerted Facebook to the equivalent content being posted (and thus did not impose a general monitoring obligation).
The case was then escalated to the CJEU, where basically everything went off the rails.
As governments around the world seek greater influence over the Web, the European Union has emerged as a model of legislative intervention, with efforts from GDPR to the Right to be Forgotten to new efforts to allow EU lawmakers
to censor international criticism of themselves. GDPR has backfired spectacularly, stripping away the EU's previous privacy protections and largely exempting the most dangerous and privacy-invading activities it was touted to address. Yet it is the EU's
efforts to project its censorship powers globally that present the greatest risk to the future of the Web and demonstrate just how little the EU actually understands about how the internet works.
A Polish court has held a first hearing in a case brought against Facebook by a historian who says that Facebook engaged in censorship by suspending accounts that had posted about a nationalist rally in Warsaw.
Historian Maciej Swirski has complained that Facebook in 2016 suspended a couple of accounts that provided information on an independence day march organised by far-right groups. Swirski told AFP:
I'm not a member of the National Movement, but as a citizen I wanted to inform myself on the event in question and I was blocked from doing so,
This censorship doesn't concern my own posts, but rather content that I had wanted to see.
Facebook's lawyers argued that censorship can only be exercised by the state and that a private media firm is not obligated to publish any particular content.
The next court hearing will take place on October 30.
Prior to the European Parliament elections, popular YouTube users in Germany appealed to their followers to boycott the Christian Democratic Union (CDU), the Social Democrats (SPD) and the Alternative für Deutschland (AfD).
Following a miserable election result, CDU leader Annegret Kramp-Karrenbauer made statements suggesting that in the future, such opinions may be censored.
Popular German YouTube star Rezo urged voters to punish the CDU and its coalition partner by not voting for them. Rezo claimed that the government's inaction on critical issues such as climate change, security and intellectual property rights is
destroying our lives and our future.
Rezo quickly found the support of 70 other influential YouTube presenters. But politicians accused him of misrepresenting information and lacking credibility in an effort to discredit him. Nonetheless, his video had nearly 4 million views by Sunday,
the day of the election.
Experts like Prof. Jürgen Falter of the University of Mainz believe that Rezo's video swayed the opinions of many undecided voters, especially those under age 30.
Kramp-Karrenbauer commented on it during a press conference:
What would have happened in this country if 70 newspapers decided just two days before the election to make the joint appeal: 'Please don't vote for the CDU and SPD'? That would have been a clear case of political bias before the election.
What are the rules that apply to opinions in the analog sphere? And which rules should apply in the digital sphere?
She concluded that these topics will be discussed by the CDU, saying:
I'm certain, they'll play a role in discussions surrounding media policy and democracy in the future.
Many interpreted her statements as an attack on freedom of speech and a call to censor people's opinions online. Ria Schröder, head of the Young Liberals, wrote:
The CDU's understanding of democracy, 'If you are against me, I censor you', is incomprehensible!
The right of a user on YouTube or other social media to discuss his or her political view is covered by Germany's Basic Law, which guarantees freedom of speech.
Kramp-Karrenbauer's statements may threaten her chance for the chancellorship. More importantly, they expose the mindset of Germany's political leadership.
Poland is challenging the EU's copyright directive in the EU Court of Justice (CJEU) on grounds of its threats to freedom of speech on the internet, Foreign Minister Jacek Czaputowicz said on Friday.
The complaint especially addresses a mechanism obliging online services to run preventive checks on user content even without suspicion of copyright infringement. Czaputowicz explained at a press conference in Warsaw:
Poland has challenged the copyright directive before the CJEU, because in our opinion it creates a fundamental threat to freedom of speech on the internet. Such censorship is forbidden both by the Polish constitution and EU law. The Charter of Fundamental Rights (of the European Union - PAP) guarantees freedom of speech.
The directive is to change the way online content is published and monitored. EU members have two years to introduce the new regulations. Poland, the Netherlands, Italy, Finland and Luxembourg are against the directive.
Ireland's Justice Minister Charlie Flanagan confirmed that the Irish government will consider a similar system to the UK's so-called porn block law as part of new legislation on online safety. Flanagan said:
I would be very keen that we would engage widely to ensure that Ireland could benefit from what is international best practice here and that is why we are looking at what is happening in other jurisdictions.
The Irish communications minister Richard Bruton said there are also issues around privacy laws and this has to be carefully dealt with. He said:
It would be my view that government through the strategy that we have published, we have a cross-government committee who is looking at policy development to ensure online safety, and I think that forum is the forum where I believe we
will discuss what should be done in that area because I think there is a genuine public concern, it hasn't been the subject of the Law Reform Commission or other scrutiny of legislation in this area, but it was worthy of consideration, but it does have
its difficulties, as the UK indeed has recognised also.
The internet technology known as deep packet inspection is currently illegal in Europe, but big telecom companies doing business in the European Union want to change that. They want deep packet inspection permitted as part of the new net neutrality rules
currently under negotiation in the EU, but on Wednesday, a group of 45 privacy and internet freedom advocates and groups published an open letter warning against the change:
Dear Vice-President Andrus Ansip, (and others)
We are writing you in the context of the evaluation of Regulation (EU) 2015/2120 and the reform of the BEREC Guidelines on its implementation. Specifically, we are concerned because of the increased use of Deep Packet Inspection (DPI)
technology by providers of internet access services (IAS). DPI is a technology that examines data packets transmitted in a given network beyond what would be necessary for the provision of IAS, by looking at specific content from the user-defined payload of the transmission.
IAS providers are increasingly using DPI technology for the purpose of traffic management and the differentiated pricing of specific applications or services (e.g. zero-rating) as part of their product design. DPI allows IAS providers
to identify and distinguish traffic in their networks, in order to single out the traffic of specific applications or services for purposes such as billing them differently, throttling them, or prioritising them over other traffic.
The undersigned would like to recall the concerning practice of examining domain names or the addresses (URLs) of visited websites and other internet resources. The evaluation of these types of data can reveal sensitive information
about a user, such as preferred news publications, interest in specific health conditions, sexual preferences, or religious beliefs. URLs directly identify specific resources on the world wide web (e.g. a specific image, a specific article in an
encyclopedia, a specific segment of a video stream, etc.) and give direct information on the content of a transmission.
A mapping of differential pricing products in the EEA conducted in 2018 identified 186 such products which potentially make use of DPI technology. Several of these products, offered by mobile operators with large market shares, are confirmed to rely on DPI because they offer providers of applications or services the option of identifying their traffic via criteria such as domain names, SNI, URLs or DNS snooping.
Currently, the BEREC Guidelines clearly state that traffic management based on the monitoring of domain names and URLs (as implied by the phrase "transport protocol layer payload") is not reasonable traffic management under the
Regulation. However, this clear rule has been mostly ignored by IAS providers in their treatment of traffic.
The nature of DPI necessitates telecom expertise as well as expertise in data protection issues. Yet, we observe a lack of cooperation between national regulatory authorities for electronic communications and regulatory authorities
for data protection on this issue, both in the decisions put forward on these products as well as cooperation on joint opinions on the question in general. For example, some regulators issue justifications of DPI based on the consent of the customer of
the IAS provider, which crucially ignores the clear ban on DPI in the BEREC Guidelines and the processing of the data of the other party communicating with the subscriber, who never gave consent.
Given the scale and sensitivity of the issue, we urge the Commission and BEREC to carefully consider the use of DPI technologies and their data protection impact in the ongoing reform of the net neutrality Regulation and the
Guidelines. In addition, we recommend to the Commission and BEREC to explore an interpretation of the proportionality requirement included in Article 3, paragraph 3 of Regulation 2015/2120 in line with the data minimization principle established by the
GDPR. Finally, we suggest to mandate the European Data Protection Board to produce guidelines on the use of DPI by IAS providers.
European Digital Rights, Europe Electronic Frontier Foundation, International Council of European Professional Informatics Societies, Europe Article 19, International Chaos Computer Club e.V, Germany epicenter.works - for digital
rights, Austria Austrian Computer Society (OCG), Austria Bits of Freedom, the Netherlands La Quadrature du Net, France ApTI, Romania Code4Romania, Romania IT-Pol, Denmark Homo Digitalis, Greece Hermes Center, Italy X-net, Spain Vrijschrift, the
Netherlands Dataskydd.net, Sweden Electronic Frontier Norway (EFN), Norway Alternatif Bilisim (Alternative Informatics Association), Turkey Digitalcourage, Germany Fitug e.V., Germany Digitale Freiheit, Germany Deutsche Vereinigung für Datenschutz e.V.
(DVD), Germany Gesellschaft für Informatik e.V. (GI), Germany LOAD e.V. - Verein für liberale Netzpolitik, Germany (And others)
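The inspection practices the letter objects to (examining domain names, SNI, URLs or DNS traffic) can be illustrated with a small sketch. The example below is hypothetical and not tied to any operator's actual system: it builds a minimal DNS query and then reads the queried domain back out of the raw packet bytes, which is exactly the kind of user-payload field a DPI middlebox examines to profile what a subscriber is visiting.

```python
import struct

def build_dns_query(domain: str, qtype: int = 1) -> bytes:
    """Build a minimal DNS query packet for the given domain (A record by default)."""
    # Header: id, flags (recursion desired), 1 question, 0 answer/authority/additional
    header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # QNAME is a sequence of length-prefixed labels, terminated by a zero byte
    qname = b"".join(bytes([len(label)]) + label.encode() for label in domain.split("."))
    qname += b"\x00"
    question = qname + struct.pack(">HH", qtype, 1)  # QTYPE, QCLASS=IN
    return header + question

def extract_queried_domain(packet: bytes) -> str:
    """Read the QNAME out of a DNS query -- the kind of field a DPI box inspects."""
    labels = []
    pos = 12  # skip the fixed 12-byte DNS header
    while packet[pos] != 0:
        length = packet[pos]
        labels.append(packet[pos + 1 : pos + 1 + length].decode())
        pos += 1 + length
    return ".".join(labels)

packet = build_dns_query("news.example.org")
print(extract_queried_domain(packet))  # news.example.org
```

As the letter notes, a recovered name like this can reveal sensitive information (preferred news publications, health interests, religious beliefs) even when the page content itself is encrypted.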
A pair of entrepreneurs have been refused European trademark protection for their energy drink named Brexit after an EU body labelled it offensive.
Pawel Tumilowicz and Mariusz Majchrzak had attempted to register their product Brexit with the European Union Intellectual Property Office (Euipo) after they launched the drink in October 2016.
But they were denied on the grounds that EU citizens would be deeply offended by the appropriation of the word. Euipo claimed:
Citizens across the EU would be deeply offended if the expression at issue was registered as a European Union trade mark.
The pair then appealed before Euipo's Grand Board of Appeal, which rejected Euipo's judgement that the word was offensive. However it ruled that Brexit could not be trademarked because it was not distinctive enough under EU law and would be confusing.
The high-caffeine drink - which is described on its website as the only reasonable solution in this situation - is branded with the Union Jack and was only named after the contentious political event for a laugh, the Telegraph reports.
The German President Frank-Walter Steinmeier opened the re:publica 2019 conference in Berlin last week with a speech about internet censorship. The World Socialist Web Site reported the speech:
With cynical references to Germany's Basic Law and the right to freedom of speech contained within it, Steinmeier called for new censorship measures and appealed to the major technology firms to enforce already existing guidelines.
He stated, The upcoming 70th anniversary of the German Basic Law reminds us of a connection that pre-dates online and offline: liberty needs rules--and new liberties need new rules. Furthermore, freedom of opinion brings with it
responsibility for opinion. He stressed that he knew there are already many rules, among which he mentioned the notorious Network Enforcement Law (NetzDG), but that it will be necessary to argue over others.
He then added, Anyone who creates space for a political discussion with a platform bears responsibility for democracy, whether they like it or not. Therefore, democratic regulations are required, he continued. Steinmeier said that he
felt this is now understood in Silicon Valley. After a lot of words and announcements, discussion forums, and photogenic appearances with politicians, it is now time for Facebook, Twitter, YouTube and Co. to finally acknowledge their responsibility for
democracy, finally put it into practice.
Based on the results of an investigation by Privacy International, one of Europe's key data protection authorities has opened an inquiry into Quantcast, a major player in the online tracking industry.
The Irish Data Protection Commission has now opened a statutory inquiry into Quantcast International Limited. The organisation writes:
Since the application of the GDPR significant concerns have been raised by individuals and privacy advocates concerning the conduct of technology companies operating in the online advertising sector and their compliance with the GDPR.
Arising from a submission to the Data Protection Commission by Privacy International, a statutory inquiry pursuant to section 110 of the Data Protection Act 2018 has been commenced in respect of Quantcast International Limited. The purpose of the
inquiry is to establish whether the company's processing and aggregating of personal data for the purposes of profiling and utilising the profiles generated for targeted advertising is in compliance with the relevant provisions of the GDPR. The GDPR
principle of transparency and retention practices will also be examined.
Netherlands-based publishing house Brill recently ended its distribution agreement with a Chinese state-run publisher, after the latter was found to have censored a paper submitted to one of its journals.
In a statement published on its website on April 25, Brill announced it would no longer partner with China's Higher Education Press to distribute four of its journals to customers outside China, effective in 2020.
The Dutch publishing house didn't provide an explanation for its decision.
The director of Poland's National Museum in Warsaw took it upon himself to take down a classic 1975 artwork on the grounds that it might irritate sensitive young people.
Consumer Art or Body Art is a video by Natalia Lach-Lachowicz, who goes by the name Natalia LL, showing a bare-shouldered woman eating a banana in a rather suggestive fashion.
The director also removed a 2005 video by Katarzyna Kozyra that showed a woman holding a leash attached to two men dressed as dogs on all fours. He explained:
Certain topics related to gender shouldn't be explicitly shown,
However the censorship by Museum Director Jerzy Miziolek was widely ridiculed. Many took to Instagram and other platforms to post photos of themselves eating bananas, including another prominent Polish artist, photographer Sylwia Kowalczyk.
This should not happen to any artist, male or female, Kowalczyk told CNN. Natalia Lach-Lachowicz is one of the icons of Polish contemporary art and already has her place in art history.
Hundreds of people also gathered to eat bananas outside Poland's national gallery in Warsaw on Monday to protest the censorship.
Responding to this public pressure Miziolek said that he would reinstate the Consumer Art exhibit -- but only for another week, when the museum begins a renovation project. Whether Consumer Art would return after the renovations are complete remained unclear.
Miziolek was appointed by the right-wing Law and Justice (PiS) government's Ministry of Culture. The ministry has consistently cut funding for the arts and fired arts staff who do not follow the party's line. However the ministry denied it was
involved in the decision to remove this artwork.
On the 15th of March, the German Bundesrat (Federal Council) voted to amend the Criminal Code in relation to internet-based services such as The Onion Router (Tor).
The proposed law has been lambasted as being too vague, with privacy experts rightfully fearful that the law would be overapplied. The proposal, originating from the North Rhine-Westphalian Minister of Justice Peter Biesenbach, would
amend and expand criminal law and make running a Tor node or website illegal and punishable by up to three years in prison. According to Zeit.de, if passed, the expansion of the Criminal Code would be used to punish anyone who offers an internet-based
service whose access and accessibility is limited by special technical precautions, and whose purpose or activity is directed to commit or promote certain illegal acts.
What's worse is that the proposed changes are so vaguely worded that many other services that offer encryption could be seen as falling under this new law. While the proposal does seem to have been written to target Tor hidden
services which are dark net markets, the vague way that the proposal has been written makes it a very real possibility that other encrypted services such as messaging might be targeted under these new laws, as well.
Now that the motion to amend has been accepted by the Bundesrat, it will be forwarded to the Federal Government for drafting, consideration, and comment. Then, within a month and a half, this new initiative will be forwarded to the Bundestag, Germany's federal parliament, where it will be finally voted on. Private Internet Access and many others denounce this proposal and continue to support Tor and an open internet.
Private Internet Access currently supports the Tor Project and runs a number of Tor exit nodes as a part of our commitment to online privacy. PIA believes this proposed amendment to the German Criminal Code is not just bad for Tor,
which was named specifically, but also for online privacy as a whole -- and we're not the only ones.
German criminal lawyer David Schietinger told Der Spiegel that he was concerned the law was too overreaching and could also cover an e-mail provider or the operator of a classic online platform with password protection.
The bill contains mainly rubber paragraphs with the clear goal of criminalizing operators and users of anonymization services. Intentionally, the wording is kept very vague. The intention is to create legal uncertainty and unavoidable risks of possible criminal liability for anyone who supports the right to anonymous communication on the Internet.
It's not only China and the UK that want to identify internet users, Austria also wants to demand that forum contributors submit their ID before being able to post.
Austria's government has introduced a bill that would require larger social media websites and forums to obtain the identity of their users before they are able to post comments. Users will have to provide their name and address to websites, but nicknames are still allowed and the identity data will not be made public.
Punishments for non-complying websites will be up to 500,000 euros, and double that for repeat offences.
It would only affect sites that have more than 100,000 registered users, bring in revenues above 500,000 euros per year, or receive press subsidies larger than 50,000 euros.
There would also be exemptions for retail sites as well as those that don't earn money from either ads or the content itself.
If passed and cleared by the EU, the law would take effect in 2020. The immediate issues noted are that some of the websites most offending the sensitivities of the government are often smaller than the trigger conditions. The law may also step on the toes of EU rules governing which EU state has regulatory control over websites.
Update: Identity data will be available to other users
The law on care and responsibility on the net forces media platforms with forums to store detailed data about their users in order to deliver it, in the case of a possible offence, not only to police authorities but also to other users who want to legally prosecute another forum user. Looking at the law in detail, it is obvious that it contains so many problematic passages that its intended purpose is completely undermined.
According to the Minister of Media, Gernot Blümel, harmless software will deal with the personal data processing. One of the risks of such a system would be the potential for abuse by public authorities or individuals requesting a person's name and address from a platform provider under the pretext of wanting to investigate or sue them, and then using the information for entirely different purposes.
The European Parliament has approved a draft version of a new EU internet censorship law targeting terrorist content.
In particular the MEPs approved the imposition of a one-hour deadline to remove content marked for censorship by various national organisations. However the MEPs did not approve a key section of the law requiring internet companies to pre-process and censor terrorist content prior to upload.
A European Commission official told the BBC changes made to the text by parliament made the law ineffective. The Commission will now try to restore the pre-censorship requirement with the new parliament when it is elected.
The law would affect social media platforms including Facebook, Twitter and YouTube, which could face fines of up to 4% of their annual global turnover. What does the law say?
In amendments, the European Parliament said websites would not be forced to monitor the information they transmit or store, nor have to actively seek facts indicating illegal activity. It said the competent authority should give the website
information on the procedures and deadlines 12 hours before the agreed one-hour deadline the first time an order is issued.
In February, German MEP Julia Reda of the European Pirate Party said the legislation risked the surrender of our fundamental freedoms [and] undermines our liberal democracy. Ms Reda welcomed the changes brought by the European Parliament but said the
one-hour deadline was unworkable for platforms run by individual or small providers.
The EU Council of Ministers has approved the Copyright Directive, which includes the link tax and censorship machines. The legislation was voted through by a majority of EU ministers despite noble opposition from Italy, Luxembourg, Netherlands, Poland,
Finland, and Sweden.
As explained by Julia Reda MEP, a majority of 55% of Member States, representing 65% of the population, was required to adopt the legislation. That was easily achieved with 71.26% in favor, so the Copyright Directive will now pass into law.
Several countries voted against adoption, including Italy, Luxembourg, the Netherlands, Poland, Finland, and Sweden. Belgium, Estonia, and Slovenia abstained.
But in the final picture that just wasn't enough: with both Germany and the UK voting in favor, the Copyright Directive is now adopted.
EU member states will now have two years to implement the law, which requires platforms like YouTube to sign licensing agreements with creators in order to use their content. If that is not possible, they will have to ensure that infringing content
uploaded by users is taken down and not re-uploaded to their services.
The entertainment lobby will not stop here; over the next two years, they will push for national implementations that ignore users' fundamental rights, comments Julia Reda:
It will be more important than ever for civil society to keep up the pressure in the Member States!
Mid90s is a 2018 USA comedy drama by Jonah Hill. Starring Sunny Suljic, Katherine Waterston and Lucas Hedges.
The movie follows a teenager named Stevie growing up in Los Angeles. He's struggling with his family, including his co-dependent single mom and his abusive older brother, and at school, where his richer friends seem to overlook him.
When Stevie befriends a crew of skateboarders, he learns some tough lessons about class, race, and privilege.
Mid90s was originally given an 18 rating by the Irish Film Classification Office. However, the film's distributors successfully appealed the decision and the rating was reduced to 16. Probably a good job, as an adults-only skateboard movie may have had a limited audience.
US: MPAA R rated for pervasive language, sexual content, drug and alcohol use, some violent behavior/disturbing images - all involving minors
UK: Passed 15 uncut for strong language, drug misuse, self-harm, violence
The European Parliament is set to vote on legislation that would require websites that host user-generated content to take down material reported as terrorist content within one hour. We have some examples of current notices sent to the Internet Archive
that we think illustrate very well why this requirement would be harmful to the free sharing of information and freedom of speech that the European Union pledges to safeguard.
In the past week, the Internet Archive has received a series of email notices from Europol's European Union Internet Referral Unit (EU IRU) falsely identifying hundreds of URLs on archive.org as terrorist propaganda. At least one of
these mistaken URLs was also identified as terrorist content in a separate takedown notice from the French government's L'Office Central de Lutte contre la Criminalité liée aux Technologies de l'Information et de la Communication (OCLCTIC).
The Internet Archive has a few staff members, operating in the Pacific time zone, who process takedown notices from law enforcement. Most of the falsely identified URLs mentioned here (including the report from the French government)
were sent to us in the middle of the night, between midnight and 3am Pacific, and all of the reports were sent outside of the Internet Archive's business hours.
The one-hour requirement essentially means that we would need to take reported URLs down automatically and do our best to review them after the fact.
It would be bad enough if the mistaken URLs in these examples were for a set of relatively obscure items on our site, but the EU IRU's lists include some of the most visited pages on archive.org and materials that obviously have high
scholarly and research value. See a summary below with specific examples.
A group of well-known internet pioneers has written an open letter explaining how the EU's censorship law, nominally targeting terrorism, will chill the non-terrorist internet while advantaging US internet giants over smaller European businesses. The group writes:
EU Terrorist Content regulation will damage the internet in Europe without meaningfully
contributing to the fight against terrorism
Dear MEP Dalton,
Dear MEP Ward,
Dear MEP Reda,
As a group of pioneers, technologists, and innovators who have helped create and sustain today's internet,
we write to you to voice our concern at proposals under consideration in the EU Terrorist Content Regulation.
Tackling terrorism and the criminal actors who perpetrate it is a necessary public policy objective, and the internet plays an important role in achieving this end. The tragic and harrowing incident in Christchurch, New Zealand
earlier this month has underscored the continued threat terrorism poses to our fundamental freedoms, and the need to confront it in all its forms. However, the fight against terrorism does not preclude lawmakers from their responsibility to implement
evidence-based law that is proportionate, justified, and supportive of its stated aim.
The EU Terrorist Content regulation, if adopted as proposed, will restrict the basic rights of European internet users and undercut innovation on the internet without meaningfully contributing to the fight against terrorism. We
are particularly concerned by the following aspects of the proposed Regulation:
Unclear definition of terrorist content: The definition of 'terrorist content' is extremely broad, and includes no clear exemption for educational, journalistic, or research purposes. This creates the risk of over-removal of lawful
and important public interest speech.
Lack of proportionality: The regulation applies equally to all internet hosting services, bringing thousands of services into scope that have no relevance to terrorist content. By not taking any account of the different types and
sizes of online services, nor their exposure to such illegal content, the new rules would be far out of proportion with the stated aim of the proposal.
Unworkable takedown timeframes: The obligation to remove content within a mere 60 minutes of notification will likely lead to significant over-removal of lawful content and place a catastrophic compliance burden on micro, small, and
medium-sized companies offering services within Europe. At the same time, it will greatly favour large multinational platforms that have already developed highly sophisticated content moderation operations
Reliance on upload filters and other 'proactive measures': The draft regulation frames automated upload filters as 'the' solution for terrorist content moderation at scale, and provides government agencies with the power to
mandate how such upload filters and other proactive measures are designed and implemented. But upload filtering of 'terrorist content' is fraught with challenges and risks, and only a handful of online services have the resources and capacity to build or
license such technology. As such, the proposal is setting a benchmark that only the largest platforms can meet. Moreover, upload filtering and related proactive measures risk suppressing important public interest content, such as news reports about
terrorist incidents and dispatches from warzones
We fully support efforts to combat dangerous and illegal information on the internet, including through new legislation where appropriate. Yet as currently drafted, this Regulation risks inflicting harm on free expression and due
process, competition, and the possibility to innovate online.
Given these likely ramifications we urge you to undertake a proper assessment of the proposal and make the necessary changes to ensure that the perverse outcomes described above are not realised. At the very least, any legislation of
this nature must include far greater rights protection and be built around a proportionality criterion that ensures companies of all sizes and types can comply and compete in Europe.
Citizens in Europe look to you for leadership in developing progressive policy that protects their rights, ensures their companies can compete, and protects their public interest. This legislation in its current form runs contrary to
those ambitions. We urge you to amend it, for the sake of European citizens and for the sake of the internet.
Yours sincerely,
Mitchell Baker, Executive Chairwoman, The Mozilla Foundation and Mozilla Corporation
Tim Berners-Lee, Inventor of the World Wide Web and Founder of the Web Foundation
Vint Cerf, Internet Pioneer
Brewster Kahle, Founder & Digital Librarian, Internet Archive
Jimmy Wales, Founder of Wikipedia and Member of the Board of Trustees of the Wikimedia Foundation
Markus Beckedahl, Founder, Netzpolitik; Co-founder, re:publica
Brian Behlendorf, Member of the EFF Board of Directors; Executive Director of Hyperledger at the Linux Foundation
Cindy Cohn, Executive Director, Electronic Frontier Foundation
Cory Doctorow, Author; Co-Founder of Open Rights Group; Visiting Professor at Open University (UK)
Rebecca MacKinnon, Co-founder, Global Voices; Director, Ranking Digital Rights
Katherine Maher, Chief Executive Officer of the Wikimedia Foundation
Bruce Schneier, Public-interest technologist; Fellow, Berkman Klein Center for Internet & Society; Lecturer, Harvard Kennedy School