Against all the odds, but with the support of nearly a million Europeans
, MEPs voted earlier this month to reject
the EU's proposed copyright reform--including controversial proposals to create a new "snippet" right for news publishers, and mandatory copyright filters for sites that publish user-uploaded content.
The change was testimony to how powerful and fast-moving Net activists can be. Four weeks ago, few knew that these crazy provisions were even being considered. By the June 20th vote,
Internet experts were weighing in
, and wider conversations
were starting on sites like Reddit.
The result was a vote on July 5th of all MEPs, which ended in a 318-278 victory in favour of withdrawing the Parliament's support for the language. Now all MEPs will have a chance in September to submit new amendments and vote on a final
text -- or reject the directive entirely.
While re-opening the text was a surprising set-back for Articles 11 and 13, the battle isn't over: the language to be discussed in September will be based on
the original proposal
by the European Commission, from two years ago -- which included the first versions of the copyright filters, and snippet rights. German MEP Axel Voss's controversial modifications will also be included in the debate, and there may well be a flood
of other proposals, good and bad, from the rest of the European Parliament.
There's still sizeable support for the original text: Articles 11 and 13's loudest proponents, led by Voss, persuaded many MEPs to support them by arguing that these new powers would restore the balance between American tech giants and Europe's
newspaper and creative industries -- or "close the value gap", as their arguments have it.
But using mandatory algorithmic censors and new intellectual property rights to restore balance is like Darth Vader bringing balance to the Force: the fight may involve a handful of brawling big players, but it's everybody else who would have to
deal with the painful consequences.
That's why it remains so vital for MEPs to hear voices that represent the wider public interest. Librarians, redditors, small Internet businesses and celebrity YouTubers all spoke up in a way that was impossible for the Parliament to ignore. The same Net-savvy MEPs and activists who wrote and fought for the GDPR put their names to challenges to the idea that these laws would rein in American tech companies. Wikipedians stood up and were counted: seven independent, European-language Wikipedia editions reached consensus to go dark on the day of the vote. European alternatives to Google, Facebook and Twitter argued that the proposals would set back their cause. And European artists spoke up to say that the EU shouldn't be setting up censorship and ridiculous link rights in their name.
To make sure the right amendments pass in September, we need to keep that conversation going. Read on to find out what you can do, and who you should be speaking to.
Who Spoke Up In The European Parliament?
As we noted last week, the decision to challenge the JURI committee's language on Articles 11 and 13 was not automatic -- a minimum of 78 MEPs needed to petition for it to be put to the vote. Here's the list of those MEPs who actively stepped forward to stop the bill. Also heavily involved were Julia Reda, the Pirate Party MEP who worked so hard to make the rest of the proposed directive positive for copyright reform, and then re-dedicated herself to stopping the worst excesses of the JURI language, and Marietje Schaake, the Parliament's foremost advocate for human rights online.
These are the core of the opposition to Article 13 and 11. A look at that list, and the
final list of votes
on July 5th, shows that the proposals have opponents in every corner of Europe's political map. It also shows that every MEP who voted for Articles 11 and 13 has someone close to them politically who knows why it's wrong.
What happens now?
In the next few weeks, those deep in the minutiae of the Copyright directive will be crafting amendments for MEPs to vote on in September. The tentative schedule is that the amendments are accepted until Wednesday September 5th, with a vote at
12:00 Central European Time on Wednesday September 12th.
The European Parliament has a fine tradition of producing a rich supply of amendments (the GDPR had thousands). We'll need to coalesce support around a few key fixes that will keep the directive free of censorship filters and snippet rights
language, and replace them with something less harmful to the wider Net.
Julia Reda has already proposed amendments. And one of Voss's strongest critics in the latest vote was Catherine Stihler, the Scottish MEP who had created and passed consumer-friendly directive language in her committee, which Voss ignored. (Here's her speech before the final vote.)
While we wait for those amendments to appear, the next step is to keep the pressure on MEPs to remember what's at stake -- no mandatory copyright filters, and no new ancillary rights on snippets of text.
In particular, if you talk to your MEP
, it's important to convey how you feel these proposals will affect you. MEPs are hearing from giant tech and media companies. But they are only just beginning to hear from a broader camp: the people of the Internet.
As we have been covering in the last couple of articles, a controversial EU Copyright Directive has been under discussion at the European Parliament, and in a surprising turn of events, it voted to reject fast-tracking the proposal tabled by the JURI Committee, which contained controversial provisions, particularly in Art 11 and Art 13. The proposed Directive will now get a full discussion and debate in plenary in September.
I say surprising because, for those of us who have been witnesses to (and participants in) the Copyright Wars for the last 20 years, such a defeat of copyright maximalist proposals is practically unprecedented, perhaps with the exception of ACTA. For years we've had a familiar pattern in the passing of copyright legislation: a proposal would be made to enhance protection and/or restrict liberties, and a small group of ageing millionaire musicians would be paraded out to support the changes in the interest of creators. Only copyright nerds and a few NGOs and digital rights advocates would complain; their opinions would be ignored and the legislation would pass unopposed. Rinse and repeat.
But something has changed, and a wide coalition has managed to defeat powerful media lobbies for the first time in Europe, at least for now. How was this possible?
The main change is that the media landscape is very different thanks to the Internet. In the past, the creative industries were monolithic in their support for stronger protection, and they included creators, corporations, collecting societies,
publishers, and distributors; in other words, the gatekeepers and the owners were roughly on the same side. But the Internet brought a number of new players: the tech industry and its online platforms and tools became the new gatekeepers.
Moreover, as people no longer buy physical copies of their media and the entire industry has moved towards streaming, online distributors have become more powerful. This has created a perceived imbalance, where the formerly dominant industries need
to negotiate with the new gatekeepers for access to users. This is why creators complain about a value gap
between what they perceive they should be getting, and what they actually receive from the giants.
The main result of this change from a political standpoint is that we now have two lobbying sides in the debate, which makes all the difference when it comes to this type of legislation. In the past, policymakers could ignore experts and digital rights advocates because those advocates never had the means to reach them; letters and articles by academics were not taken into account, or were paid lip service during some obscure committee discussion and then quietly shelved. Tech giants such as Google have provided lobbying access in Brussels, which has at least levelled the playing field when it comes to presenting evidence to legislators.
As a veteran of the Copyright Wars, I have to admit that it has been very entertaining reading the reaction from the copyright industry lobby groups and their individual representatives, some almost going apoplectic with rage at Google's
intervention. These tend to be the same people who spent decades lobbying legislators to get their way unopposed, representing large corporate interests unashamedly and passing laws that would benefit only a few, usually to the detriment of users.
It seems lobbying is only to be decried when you lose.
But to see this as a victory for Google and other tech giants completely ignores the large coalition that shares the view that the proposed Articles 11 and 13 are very badly thought out, and could represent a real danger to existing rights. Some of us have been fighting this fight since before Google even existed, or when it was but a small competitor to AltaVista, Lycos, Excite and Yahoo!
At the same time that more restrictive copyright legislation came into place, we also saw the rise of free and open source software, open access, Creative Commons and open data. All of these are legal hacks that allow sharing, remixing and
openness. These were created precisely to respond to restrictive copyright practices. I also remember how they were opposed as existential threats by the same copyright industries, and treated with disdain and animosity. But something wonderful
happened: eventually, open source software started winning (we used to buy operating systems), and Creative Commons became an important part of the Internet's ecosystem by propping up valuable common spaces such as Wikipedia.
Similarly, the Internet has allowed a great diversity of actors to emerge. Independent creators, small and medium enterprises, online publishers and startups love the Internet because it gives them access to a wider audience, and often they can
bypass established gatekeepers. Lost in this idiotic "Google v musicians" rhetoric has been the threat that both Art 11 and Art 13 represent to small entities. Art 11 proposes a new publishing right that has already been shown to harm smaller players in Germany and Spain; while Art 13 would impose potentially crippling economic burdens on smaller companies, as they would have to put in place automated filtering systems AND redress mechanisms against mistakes. In fact, it has often been remarked that Art 13 would benefit the existing dominant forces, as they already have filtering in place (think ContentID).
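The context-blindness at the heart of that filtering requirement is easy to sketch. Below is a deliberately toy upload filter, written only for illustration (the data, function names and the 8-word matching window are my own invention, not drawn from ContentID or any real system), that checks uploads against a database of claimed works by raw text overlap. Note that it cannot tell a wholesale copy from a short, lawful quotation:

```python
# Toy sketch of a context-blind upload filter (illustrative only).
# Real systems fingerprint audio/video rather than text, but the failure
# mode is the same: a match says nothing about quotation, criticism,
# parody, or any other lawful use of the matched material.

claimed_works = {
    "to be or not to be that is the question",  # a bulk-claimed work
}

def tokens(text):
    """Normalise to lowercase words."""
    return text.lower().split()

def is_blocked(upload, window=8):
    """Block if any run of `window` consecutive words appears in a claimed work."""
    words = tokens(upload)
    for i in range(len(words) - window + 1):
        snippet = " ".join(words[i:i + window])
        if any(snippet in work for work in claimed_works):
            return True
    return False

# A verbatim re-upload is blocked -- as intended:
print(is_blocked("To be or not to be that is the question"))   # True

# But an essay quoting eight words for criticism gets the same verdict:
review = ("Her essay asks whether to be or not to be "
          "that is the question worth revisiting")
print(is_blocked(review))  # True -- lawful quotation, blocked anyway
```

The filter has no input through which context could even be expressed, which is exactly why redress mechanisms (and their cost) become unavoidable.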
Similarly, Internet advocates and luminaries see the proposals as a threat to the Internet: the people who know the Web best think this is a bad idea. If you can stomach it, read this thread featuring a copyright lobbyist attacking Neil Gaiman, one of the Internet celebrities who have voiced their concerns about the Directive. Even copyright experts who almost never intervene in digital rights affairs have been vocal in their opposition to the changes.
And finally, we have political representatives from various parties and backgrounds who have been vocally opposed to the changes. The leader of the political opposition has been the amazing Julia Reda, who has managed to bring together a
variety of voices from other parties and countries. The vitriol launched at her has been unrelenting, but futile. It has been quite a sight to see her opponents both try to dismiss her as just another clueless young Pirate commanded by Google,
while at the same time they try to portray her as a powerful enemy in charge of the mindless and uninformed online troll masses ready to do her bidding.
All of the above managed to do something wonderful: they conveyed the threat in easy-to-understand terms, so that users could contact their representatives and make their voices heard. The level of popular opposition to the Directive has been
a great sight to behold.
Tech giants did not create this alliance; they just gave various voices access to the table. To dismiss this as Google's doing completely ignores the very real and rich tapestry of those defending digital rights; it is quite clearly patronising and insulting, and precisely the reason why they lost. It was not until very late that they finally realised they were losing the debate with the public, and not even the last-minute deployment of musical dinosaurs could save the day.
But the fight continues, keep contacting your MEPs and keep applying pressure.
So who supported internet censorship in the EU parliamentary vote?
Mostly the EU Conservative Group, along with half of the Social Democrat MEPs and half of the far-right MEPs.
Litigation can always take twists and turns, but when EFF filed a lawsuit against Universal Music Group in 2007 on behalf of Stephanie Lenz, few would have anticipated it would be ten years until the case was finally resolved. But now, at last, it is. Along the way, Lenz v. Universal contributed to strengthening fair use law, bringing nationwide attention to the issues of copyright and fair use in new digital movie-making and sharing technologies.
It all started when Lenz posted a YouTube video
of her then-toddler-aged son dancing while Prince's song "Let's Go Crazy" played in the background, and Universal used copyright claims to get the link disabled. We brought the case hoping to get some clarity from the courts on a simple but important issue: can a rightsholder use the Digital Millennium Copyright Act to take down an obvious fair use, without consequence?
Congress designed the DMCA
to give rightsholders, service providers, and users relatively precise rules of the road for policing online copyright infringement. The center of the scheme, set out in Section 512, is the notice and takedown process. In exchange for substantial protection from liability for the actions of their users, service providers must promptly take down content on their platforms that has been identified as infringing, and follow several other prescribed steps. Copyright owners, for their part, are given an expedited,
extra-judicial procedure for obtaining redress against alleged infringement, paired with explicit statutory guidance regarding the process for doing so, and provisions designed to deter and ameliorate abuse of that process.
Without Section 512, the risk of crippling liability for the acts of users would have prevented the emergence of most of the social media outlets we use today. Instead, the Internet has become the most revolutionary platform for the creation and
dissemination of speech that the world has ever known.
But Congress also knew that Section 512's powerful incentives could result in lawful material being censored from the Internet, without prior judicial scrutiny--much less advance notice to the person who posted the material--or an opportunity
to contest the removal. To inhibit abuse, Congress made sure that the DMCA included a series of checks and balances, including Section 512(f), which gives users the ability to hold rightsholders accountable if they send a DMCA notice in bad faith.
In this case, Universal Music Group claimed to have a good faith belief that Ms. Lenz's video of her child dancing to a short segment of barely-audible music infringed copyright. Yet the undisputed facts showed Universal never considered whether
Ms. Lenz's use was lawful under the fair use doctrine. If it had done so, it could not reasonably have concluded her use was infringing. On behalf of Stephanie Lenz, EFF argued that this was a misrepresentation in violation of Section 512(f).
In response, Universal argued that rightsholders have no obligation to consider fair use at all. The U.S. Court of Appeals for the Ninth Circuit rejected
that argument, correctly holding that the DMCA requires a rightsholder to consider whether the uses she targets in a DMCA notice are actually lawful under the fair use doctrine. However, the court also held that a rightsholder's determination on
that question passes muster as long as she subjectively believes it to be true. This leads to a virtually incoherent result: a rightsholder must consider fair use, but has no incentive to actually learn what such a consideration should entail.
After all, if she doesn't know what the fair use factors are, she can't be held liable for not applying them thoughtfully.
We were disappointed in that part of the ruling, but it came with a big silver lining: the court also held that fair use is not simply a narrow defense to copyright infringement but an affirmative public right. For decades, rightsholders and scholars had
debated the issue, with many preferring to construe fair use as narrowly as possible. Thanks to the Lenz decision, courts will be more likely to think of fair use, correctly, as a crucial vehicle for achieving the real purpose of copyright
law: to promote the public interest in creativity and innovation. And rightsholders are on notice: they must at least consider fair use before sending a takedown notice.
Lenz and Universal filed petitions requesting
that the Supreme Court review the Ninth Circuit's ruling. The Supreme Court denied both petitions. This meant that the case returned to the district court for trial on the question of whether Universal's takedown was a misrepresentation under the
Ninth Circuit's subjective standard. Rather than go to trial, the parties have agreed to a settlement.
Lenz v. Universal helped make some great law on fair use and also played a role in leading to better takedown processes at Universal. EFF congratulates Stephanie Lenz for fighting the good fight, and we thank our co-counsel at
Keker, Van Nest & Peters LLP
and Kwun Bhansali Lazarus LLP
for being our partners through this long journey.
The European Parliament's Committee on Legal Affairs (JURI) has officially approved Articles 11 and 13 of a Digital Single
Market (DSM) copyright proposal, mandating censorship machines and a link tax.
Articles 11 and 13 of the Directive of the European Parliament and of the Council on Copyright in the Digital Single Market have of late been the subject of considerable campaigning by digital rights groups including the Open Rights Group and the Electronic Frontier Foundation.
Article 11, as per the final version of the proposal, implements a link tax: the requirement that any site citing third-party materials do so in a way that adheres to the exemptions and restrictions of a total of 28 separate national copyright laws, or pays for a licence to use and link to the material;
Article 13, meanwhile, requires any site which allows users to post text, sound, program code, still or moving images, or any other work which can be copyrighted to automatically scan all such uploads against a database of copyrighted works - a
database which they will be required to pay to access.
Neither Article 11 nor Article 13 will become official legislation until passed by the entire European Parliament in a plenary vote. There's no definite timetable for when such a vote might take place, but it would likely happen sometime between
December of this year and the first half of 2019.
On June 20, the EU's legislative committee will vote on the
new Copyright directive
, and decide whether it will include the controversial "Article 13" (automated censorship of anything an algorithm identifies as a copyright violation) and "Article 11" (no linking to news stories without paid permission from the publisher).
These proposals will make starting new internet companies effectively impossible -- Google, Facebook, Twitter, Apple, and the other US giants will be able to negotiate favourable rates and build out the infrastructure to comply with these
proposals, but no one else will. The EU's regional tech success stories -- say Seznam.cz
, a successful Czech search competitor to Google -- don't have $60-100,000,000 lying around to build out their filters, and lack the leverage to extract favorable linking licenses from news sites.
If Articles 11 and 13 pass, American companies will be in charge of Europe's conversations, deciding which photos and tweets and videos can be seen by the public, and who may speak.
So far, the focus in the debate has been on the intended consequences of the proposals: the idea that a certain amount of free expression and competition must be sacrificed to enable rightsholders to force Google and Facebook to share their revenues.
But the unintended -- and utterly foreseeable -- consequences are even more important. Article 11's link tax allows news sites to decide who gets to link to them, meaning that they can exclude their critics. With election cycles dominated by
hoaxes and fake news, the right of a news publisher to decide who gets to criticise it is carte blanche to lie and spin.
Article 13's copyright filters are even more vulnerable to attack: the proposals contain no penalties for false claims of copyright ownership, but they do mandate that the filters must accept copyright claims in bulk, allowing
rightsholders to upload millions of works at once in order to claim their copyright and prevent anyone from posting them.
That opens the doors to all kinds of attacks. The obvious one is that trolls might sow mischief by uploading millions of works they don't hold the copyright to, in order to prevent others from quoting them: the works of Shakespeare, say, or
everything ever posted to Wikipedia, or my novels, or your family photos.
More insidious is the possibility of targeted strikes during crisis: stock-market manipulators could use bots to claim copyright over news about a company, suppressing its sharing on social media; political actors could suppress key articles
during referendums or elections; corrupt governments could use arms-length trolls to falsely claim ownership of footage of human rights abuses.
It's asymmetric warfare: falsely claiming a copyright will be easy (because the rightsholders who want this system will not tolerate jumping through hoops to make their claims) and instant (because rightsholders won't tolerate delays when their
new releases are being shared online at their moment of peak popularity). Removing a false claim of copyright will require that a human at an internet giant looks at it, sleuths out the truth of the ownership of the work, and adjusts the database
-- for millions of works at once. Bots will be able to pollute the copyright databases much faster than humans could possibly clear them.
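The arithmetic behind that asymmetry can be made concrete with a toy back-of-envelope model. Every rate below is invented purely for illustration, not taken from any real platform; the point is only the shape of the result: when claims arrive at machine speed and removals require a human, the backlog grows without bound.

```python
# Toy model of the claim-vs-review asymmetry (all rates hypothetical).

CLAIMS_PER_BOT_PER_DAY = 100_000    # a script bulk-filing claims
BOTS = 10
REVIEWS_PER_HUMAN_PER_DAY = 200     # each bad claim needs a human to sleuth ownership
REVIEWERS = 500

daily_claims = CLAIMS_PER_BOT_PER_DAY * BOTS            # 1,000,000 new claims/day
daily_reviews = REVIEWS_PER_HUMAN_PER_DAY * REVIEWERS   # 100,000 cleared/day

backlog = 0
for day in range(30):
    # Claims accumulate faster than humans can clear them.
    backlog += daily_claims - daily_reviews

print(f"After 30 days: {backlog:,} unreviewed claims")  # After 30 days: 27,000,000 unreviewed claims
```

Whatever the real numbers turn out to be, as long as filing is automated and review is manual, the gap compounds daily in the claimants' favour.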
I spoke with Wired UK's KG Orphanides about this, and their excellent article
on the proposal is the best explanation I've seen of the uses of these copyright filters to create unstoppable disinformation campaigns.
Doctorow highlighted the potential for unanticipated abuse of any automated copyright filtering system to make false copyright claims, engage in targeted harassment and even silence public discourse at sensitive times.
"Because the directive does not provide penalties for abuse -- and because rightsholders will not tolerate delays between claiming copyright over a work and suppressing its public display -- it will be trivial to claim copyright over key
works at key moments or use bots to claim copyrights on whole corpuses.
The nature of automated systems, particularly if powerful rightsholders insist that they default to initially blocking potentially copyrighted material and then releasing it if a complaint is made, would make it easy for griefers to use copyright
claims over, for example, relevant Wikipedia articles on the eve of a Greek debt-default referendum or, more generally, public domain content such as the entirety of Wikipedia or the complete works of Shakespeare.
"Making these claims will be MUCH easier than sorting them out -- bots can use cloud providers all over the world to file claims, while companies like Automattic (WordPress) or Twitter, or even projects like Wikipedia, would have to marshal
vast armies to sort through the claims and remove the bad ones -- and if they get it wrong and remove a legit copyright claim, they face unbelievable copyright liability."
Politicians, about to vote in favor of mandatory upload filtering in Europe, get channel deleted by
YouTube's upload filtering.
French politicians of the former Front National are furious: their entire YouTube channel was just taken down by automatic filters at YouTube for alleged copyright violations. Perhaps this will cause them to reconsider next week's vote, which they
have announced they will support: the bill that will make exactly this arbitrary, political, and unilateral upload filtering mandatory all across Europe.
The French party Front National, now renamed Rassemblement National (National Rally), which is one of the biggest parties in France, has had its YouTube channel disappeared on grounds of alleged copyright violations. In an interview with France's Europe 1, party leader Marine Le Pen called the takedown arbitrary, political, and unilateral.
Europe is about to vote on new copyright law next week. Next Wednesday or Thursday. So let's disregard here for a moment that this happened to a party normally described as far-right, and observe that if it can happen to one of France's biggest
parties regardless of their policies, then it can happen to anyone, for political reasons or any other reason.
The broadcast channel, named TVLibertés, is gone; YouTube's notice reads: "YouTube has blocked the broadcast of the newscast of Thursday, June 14 for copyright infringement."
Marine Le Pen was quoted as saying, "This measure is completely false; we can easily assert a right of quotation" [to illustrate why the material was well within the law to broadcast].
She's right. Automated upload filters do not take into account when you have a legal right to broadcast copyrighted material for one of a myriad of valid reasons. They will just assume that such reasons never exist; if nothing else, to make
sure that the hosting platform steers clear of any liability. Political messages will be disappeared on mere allegations by a political opponent, just as might have happened here.
And yet, the Rassemblement National is going to vote in favor of exactly this mandatory upload filtering: the very horror they just described on national TV as arbitrary, political, and unilateral.
It's hard to illustrate more clearly that Europe's politicians have absolutely no idea about the monster they're voting on next week.
The decisions to come will be unilateral, political, and arbitrary. Freedom of speech will be unilateral, political, and arbitrary. Just as Marine Le Pen says. Just as YouTube's Content ID filtering is today, as has just been illustrated.
The article mandating this unilateral, political, and arbitrary censorship is called Article 13 of the upcoming European Copyright bill, and it must be removed entirely. There is no fixing of automated censorship machines.
Privacy remains your own responsibility. So do your freedoms of speech, information, and expression.
David Kaye, the UN's Special Rapporteur on freedom of expression, has now chimed in with a very thorough report, highlighting how Article 13 of the Directive -- the part about mandatory copyright filters -- would be a disaster for free speech and would violate the Universal Declaration of Human Rights, in particular Article 19, which says:
Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.
As Kaye's report notes, the upload filters of Article 13 of the Copyright Directive would almost certainly violate this principle.
Article 13 of the proposed Directive appears likely to incentivize content-sharing providers to restrict at the point of upload user-generated content that is perfectly legitimate and lawful. Although the latest proposed versions of Article 13 do
not explicitly refer to upload filters and other content recognition technologies, it couches the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating best efforts and taking effective and
proportionate measures. Article 13(5) indicates that the assessment of effectiveness and proportionality will take into account factors such as the volume and type of works and the cost and availability of measures, but these still leave
considerable leeway for interpretation.
The significant legal uncertainty such language creates does not only raise concern that it is inconsistent with the Article 19(3) requirement that restrictions on freedom of expression should be provided by law. Such uncertainty would also raise
pressure on content sharing providers to err on the side of caution and implement intrusive content recognition technologies that monitor and filter user-generated content at the point of upload. I am concerned that the restriction of
user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions. Exacerbating these concerns is the reality
that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching, criticism, satire and parody.
Kaye further notes that copyright is not the kind of thing that an algorithm can readily determine, and the fact-specific and context-specific nature of copyright requires much more than just throwing algorithms at the problem -- especially when a
website may face legal liability for getting it wrong.
The designation of such mechanisms as the main avenue to address users' complaints effectively delegates content blocking decisions under copyright law to extrajudicial mechanisms, potentially in violation of minimum due process guarantees under
international human rights law. The blocking of content -- particularly in the context of fair use and other fact-sensitive exceptions to copyright -- may raise complex legal questions that require adjudication by an independent and impartial
judicial authority. Even in exceptional circumstances where expedited action is required, notice-and-notice regimes and expedited judicial process are available as less invasive means for protecting the aims of copyright law.
In the event that content blocking decisions are deemed invalid and reversed, the complaint and redress mechanism established by private entities effectively assumes the role of providing access to remedies for violations of human rights law. I
am concerned that such delegation would violate the State's obligation to provide access to an effective remedy for violations of rights specified under the Covenant. Given that most of the content sharing providers covered under Article 13 are
profit-motivated and act primarily in the interests of their shareholders, they lack the qualities of independence and impartiality required to adjudicate and administer remedies for human rights violations. Since they also have no incentive to
designate the blocking as being on the basis of the proposed Directive or other relevant law, they may opt for the legally safer route of claiming that the upload was a terms of service violation -- this outcome may deprive users of even the
remedy envisioned under Article 13(7). Finally, I wish to emphasize that unblocking, the most common remedy available for invalid content restrictions, may often fail to address financial and other harms associated with the blocking of content.
He goes on to point out that while large platforms may be able to deal with all of this, smaller ones are going to be in serious trouble:
I am concerned that the proposed Directive will impose undue restrictions on nonprofits and small private intermediaries. The definition of an online content sharing provider under Article 2(5) is based on ambiguous and highly subjective criteria
such as the volume of copyright protected works it handles, and it does not provide a clear exemption for nonprofits. Since nonprofits and small content sharing providers may not have the financial resources to establish licensing agreements with
media companies and other right holders, they may be subject to onerous and legally ambiguous obligations to monitor and restrict the availability of copyright protected works on their platforms. Although Article 13(5)'s criteria for effective
and proportionate measures take into account the size of the provider concerned and the types of services it offers, it is unclear how these factors will be assessed, further compounding the legal uncertainty that nonprofits and small providers
face. It would also prevent a diversity of nonprofit and small content-sharing providers from potentially reaching a larger size, and result in strengthening the monopoly of the currently established providers, which could be an impediment to the
right to science and culture as framed in Article 15 of the ICESCR.