Back in March, Australia shelved plans to extend its copyright safe harbour provisions to services such as Google and Facebook. Now, following consultations with the entertainment industries, the government has revealed it will indeed exclude such platforms from safe harbour provisions.
Services such as Google, Facebook and YouTube now face massive legal uncertainty, as they can themselves be held responsible for copyright-infringing posts by users. The logical consequence is that these companies would have to check every post before upload, and the sheer volume of posts would make that economically unviable.
Proposed amendments to the Copyright Act earlier this year would have extended safe harbour protections to such platforms, but they were withdrawn at the eleventh hour due to lobbying by media companies. Such companies accuse platforms like YouTube of exploiting safe harbour provisions in the US and Europe, forcing copyright holders into an expensive battle to have infringing content taken down.
Communications Minister Mitch Fifield has confirmed the exclusions, so now it is up to Google and Facebook to consider how they can operate under this law.
A federal judge has ruled that EFF need not obey an Australian injunction ordering EFF to take down a Stupid Patent of the Month blog post and never speak of the patent owner's intellectual property again.
It all started when Global Equity Management (SA) Pty Ltd (GEMSA)'s patent was featured as the June 2016 entry in our Stupid Patent of the Month blog series. GEMSA wrote to EFF accusing us of false and malicious slander. It subsequently filed a lawsuit and obtained an injunction from a South Australia court purporting to require EFF to censor itself. We declined and filed suit in the U.S. District Court for the Northern District of California, seeking a declaration that EFF's post is protected speech.
The court agreed, finding that the South Australian injunction can't be enforced in the U.S. under a 2010 federal law that took aim at libel tourism, a practice by which plaintiffs--often billionaires, celebrities, or oligarchs--sued U.S. writers and academics in countries like England where it was easier to win a defamation case. The Securing the Protection of Our Enduring and Established Constitutional Heritage Act (SPEECH Act) says foreign orders aren't enforceable in the United States unless they are consistent with the free speech protections provided by the U.S. and state constitutions, as well as state law.
The court analyzed each of GEMSA's claims for defamation and found that "[n]one of these claims could give rise to defamation under U.S. and California law, and accordingly EFF would not have been found liable for defamation under U.S. and California law." For example, GEMSA's lead complaint was that EFF had called its patent stupid. GEMSA protested that its patent is not in fact stupid, but the court found that this was clearly protected opinion. Moreover, the court found that the Australian court lacked jurisdiction over EFF, and that this constitutes a separate and independent reason that EFF would prevail under the SPEECH Act.
Furthermore, the court found that the Australian order was not enforceable under the SPEECH Act because U.S. and California law would provide substantially more First Amendment protection, prohibiting prior restraints on speech in all but the most extreme circumstances and providing additional procedural protections in the form of California's anti-SLAPP law.
After its thorough analysis, the court declared (1) that the Australian injunction is repugnant to the United States Constitution and the laws of California and the United States; and (2) that the Australian injunction cannot be recognized or enforced in the United States.
The decision was a default judgment. GEMSA, which has three pending patent lawsuits in the Northern District of California, had until May 23 to respond to our case. That day came and went without a word. While GEMSA knows its way around U.S. courts--having filed dozens of lawsuits against big tech companies claiming patent infringement--it failed to respond to ours.
The European Union is in the process of creating an authority to monitor and censor so-called fake news. It is setting up a High-Level 'Expert' Group. The EU is currently consulting media professionals and the public to decide what powers to
give to this EU body, which is to begin operation next spring.
World Socialist Web Site has its own colourful view on the intentions of the body, but I don't suppose it is too far from the truth:
An examination of the EU's announcement shows that it is preparing mass state censorship aimed not at false information, but at news reports or political views that encourage popular opposition to the European ruling class.
It aims to create conditions where unelected authorities control what people can read or say online.
EU Vice-President Frans Timmermans explained the move in ominous terms:
We live in an era where the flow of information and misinformation has become almost overwhelming. The EU's task is to protect its citizens from fake news and to manage the information they receive.
According to an EU press release, the EU Commission, another unelected body, will select the High-Level Expert Group, which is to start in January 2018 and will work over several months. It will discuss possible future actions to strengthen
citizens' access to reliable and verified information and prevent the spread of disinformation online.
Who will decide what views are verified, who is reliable and whose views are disinformation to be deleted from Facebook or removed from Google search results? The EU, of course.
Three countries are using the European Council to put dangerous pro-censorship amendments into the already controversial Copyright Directive.
The copyright law that OpenMedia has been campaigning on -- the one pushing the link tax and censorship machines -- is facing some dangerous sabotage from the European Council. In particular, France, Spain and Portugal are pushing to make the proposal even worse.
The Bill is currently being debated in the European Parliament, but the European Council also gets to produce its own proposed version of the law, and the two versions eventually have to be reconciled. The European Council is made up of ministers from the governments of all EU member states. Those ministers are usually represented by staff who do most of the negotiating on their behalf. It is not a transparent body, but it does have a lot of power.
The Council can choose to agree with Parliament's amendments, but it doesn't look like that's going to happen in this case. In fact they've been taking worrying steps, particularly when it comes to the censorship machine proposals.
As the proposal stands before the Council's intervention, it encourages sites where users upload and share content to install filtering mechanisms -- a kind of censorship machine which would use algorithms to look for copyrighted content and then block the post. This is despite the fact that there are many legal reasons to use copyrighted content.
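The flaw in the "censorship machine" idea can be illustrated with a deliberately crude sketch. This is a hypothetical illustration, not how any real platform's filter is built: it blocks uploads whose hash matches a blocklist of known works, and by construction it cannot tell an infringing copy from a lawful quotation of the same material.

```python
import hashlib

# Hypothetical sketch: the crudest possible upload filter, matching exact
# SHA-256 hashes against a blocklist of known copyrighted files. Real
# filters use fuzzy perceptual fingerprinting, but they share the core
# flaw shown here: the filter sees only the content of an upload, never
# its legal context (quotation, parody, criticism, review).

BLOCKLIST = {hashlib.sha256(b"full copyrighted film").hexdigest()}

def filter_upload(data: bytes) -> str:
    """Return 'blocked' if the upload matches a blocklisted work."""
    if hashlib.sha256(data).hexdigest() in BLOCKLIST:
        return "blocked"  # blocked regardless of why it was uploaded
    return "published"

# A verbatim copy is caught -- but so would be the same material inside
# a lawful review, while a trivially altered pirate copy sails through.
print(filter_upload(b"full copyrighted film"))   # blocked
print(filter_upload(b"full copyrighted film."))  # published
```

The sketch makes the asymmetry concrete: automated matching over-blocks lawful uses while under-blocking determined infringers, which is exactly the objection raised against the proposal.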
These new changes go a step further. First, they want to make the censorship machine demands even more explicit. As Julia Reda puts it:
They want to add to the Commission proposal that platforms need to automatically remove media that has once been classified as infringing, regardless of the context in which it is uploaded.
Then, they go all in with a suggested rewrite of existing copyright law to end the liability protections which are vital for a functioning web.
Liability protection laws mean that we (not websites) are responsible for what we say and post online. This is so that websites are not obliged to monitor everything we say or do. If they were liable, there would be much overzealous blocking and censorship. These rules made YouTube, podcast platforms, and social media possible. The web as we know it works because of these rules.
But the governments of France, Spain, Portugal and the Estonian Presidency of the Council want to undo them. It would mean all these sites could be sued for any infringement posted there. It would deter new sites from being developed. And it would cause huge legal confusion -- given that the exact opposite is laid out in a different EU law.
A Spanish judge has dealt a blow to copyright trolls in Spain. In a first-of-its-kind ruling, the court dismissed a file-sharing case due to a lack of evidence.
The Commercial Court of Donostia dismissed the claim against an alleged file-sharer due to a lack of evidence. Copyright company Dallas Buyers Club identified the infringer through an IP-address, but according to Judge Pedro José Malagón Ruiz,
this is not good enough.
"The ruling says that there is no way to know whether the defendant was the P2P user or not, because an IP address only identifies the person who subscribed to the Internet connection, not the user who made use of the connection at a certain moment," copyright lawyer David Bravo tells TorrentFreak.
"A relative or a guest could have been using the network, or even someone accessing the wifi if it was open," he adds.
In addition, the Judge agreed with the defense that there is no evidence that the defendant actively made the movie available. This generally requires a form of intent. However, BitTorrent clients automatically share files with others, whether
it's the intention of the user or not.
In other words, these BitTorrent transfers are not necessarily an act of public communication, and therefore they do not infringe any copyrights.
Article 13: Monitoring and filtering of internet content is unacceptable. Index on Censorship joined with 56 other NGOs to call for the deletion of Article 13 from the proposal on the Digital Single Market, which includes obligations on internet
companies that would be impossible to respect without the imposition of excessive restrictions on citizens' fundamental rights.
Dear President Juncker,
Dear President Tajani,
Dear President Tusk,
Dear Prime Minister Ratas,
Dear Prime Minister Borissov,
Dear MEP Voss, dear MEP Boni,
The undersigned stakeholders represent fundamental rights organisations.
Fundamental rights, justice and the rule of law are intrinsically linked and constitute core values on which the EU is founded. Any attempt to disregard these values undermines the mutual trust between member states required for the EU to
function. Any such attempt would also undermine the commitments made by the European Union and national governments to their citizens.
Article 13 of the proposal on Copyright in the Digital Single Market includes obligations on internet companies that would be impossible to respect without the imposition of excessive restrictions on citizens' fundamental rights.
Article 13 introduces new obligations on internet service providers that share and store user-generated content, such as video or photo-sharing platforms or even creative writing websites, including obligations to filter uploads to their
services. Article 13 appears to provoke such legal uncertainty that online services will have no other option than to monitor, filter and block EU citizens' communications if they are to have any chance of staying in business.
Article 13 contradicts existing rules and the case law of the Court of Justice. The Electronic Commerce Directive (2000/31/EC) regulates the liability of those internet companies that host content on behalf of their users. According to the existing rules, there is an obligation to remove any content that breaches copyright rules once this has been notified to the provider.
Article 13 would force these companies to actively monitor their users' content, which contradicts the 'no general obligation to monitor' rule in the Electronic Commerce Directive. The requirement to install a system for filtering electronic communications has twice been rejected by the Court of Justice, in the cases Scarlet Extended (C-70/10) and Netlog/Sabam (C-360/10). Therefore, a legislative provision that requires internet companies to install a filtering system would almost certainly be rejected by the Court of Justice because it would contravene the requirement that a fair balance be struck between the right to intellectual property on the one hand, and the freedom to conduct business and the right to freedom of expression, such as to receive or impart information, on the other.
In particular, the requirement to filter content in this way would violate the freedom of expression set out in Article 11 of the Charter of Fundamental Rights. If internet companies are required to apply filtering mechanisms in order to
avoid possible liability, they will. This will lead to excessive filtering and deletion of content and limit the freedom to impart information on the one hand, and the freedom to receive information on the other.
If EU legislation conflicts with the Charter of Fundamental Rights, national constitutional courts are likely to be tempted to disapply it and we can expect such a rule to be annulled by the Court of Justice. This is what happened with the
Data Retention Directive (2006/24/EC), when EU legislators ignored compatibility problems with the Charter of Fundamental Rights. In 2014, the Court of Justice declared the Data Retention Directive invalid because it violated the Charter.
Taking into consideration these arguments, we ask the relevant policy-makers to delete Article 13.
European Digital Rights (EDRi)
Associação D3 -- Defesa dos Direitos Digitais
Associação Nacional para o Software Livre (ANSOL)
Association for Progressive Communications (APC)
Association for Technology and Internet (ApTI)
Association of the Defence of Human Rights in Romania (APADOR)
Bangladesh NGOs Network for Radio and Communication (BNNRC)
Bits of Freedom (BoF)
Bulgarian Helsinki Committee
Center for Democracy & Technology (CDT)
Centre for Peace Studies
Coalizione Italiana Libertà e Diritti Civili (CILD)
Code for Croatia
Culture Action Europe
Electronic Frontier Foundation (EFF)
Estonian Human Rights Centre
Freedom of the Press Foundation
Frënn vun der Ënn
Helsinki Foundation for Human Rights
Hermes Center for Transparency and Digital Human Rights
Human Rights Monitoring Institute
Human Rights Watch
Human Rights Without Frontiers
Hungarian Civil Liberties Union
Index on Censorship
International Partnership for Human Rights (IPHR)
International Service for Human Rights (ISHR)
Justice & Peace
La Quadrature du Net
Media Development Centre
Miklos Haraszti (Former OSCE Media Representative)
Modern Poland Foundation
Netherlands Helsinki Committee
One World Platform
Open Observatory of Network Interference (OONI)
Open Rights Group (ORG)
Plataforma en Defensa de la Libertad de Información (PDLI)
Reporters without Borders (RSF)
Rights International Spain
South East Europe Media Organisation (SEEMO)
South East European Network for Professionalization of Media (SEENPM)
The Right to Know Coalition of Nova Scotia (RTKNS)
At EFF, we see endless attempts to misuse copyright law in order to silence content that a person dislikes. Copyright law is sadly less speech-protective than other speech regulations such as defamation law, so plaintiffs are motivated to find ways to turn many kinds of disputes into issues of copyright law. Yesterday, a federal appeals court rejected one such ploy: an attempt to use copyright to get rid of a negative review.
The website Ripoff Report hosts criticism of a variety of professionals and companies, who doubtless would prefer that those critiques not exist. In order to protect platforms for speech like Ripoff Report, federal law sets a very high bar for
private litigants to collect damages or obtain censorship orders against them. The gaping exception to this protection is intellectual property claims, including copyright, for which a lesser protection applies.
One aggrieved professional named Goren (and his company) went to court to get a negative review taken down from Ripoff Report. If Goren had relied on a defamation claim alone, the strong protection of CDA 230 would protect Ripoff Report. But
Goren sought to circumvent that protection by getting a court order seizing ownership of the copyright from its author for himself, then suing Ripoff Report's owner for copyright infringement. We
filed a brief explaining several reasons why his claims should fail, and urging the court to prevent the use of copyright as a pretense for suppressing speech.
Fortunately, the Court of Appeals for the First Circuit agreed that Ripoff Report is not liable. It ruled on a narrow basis, pointing out that the person who originally posted the review on Ripoff Report gave the site's owners irrevocable
permission to host that content. Therefore, continuing to host it could not be an infringement, even if Goren did own the copyright.
Goren paid the price for his improper assertion of copyright here: the appeals court upheld an award of over $100,000 in attorneys' fees. The award of fees in a case like this is important both because it deters improper assertion of copyright,
and because it helps compensate defendants who choose to litigate rather than settling for nuisance value simply to avoid the expense of defending their rights.
We're glad the First Circuit acted to limit the ways that private entities can censor speech online.
The EU is considering forcing websites to vet uploaded content for pirated material. Of course, only the media giants have the capability to do this, and so the smaller players would be killed off (probably as intended).
If you've been following the slow progress of the European Commission's proposal to introduce new upload filtering mandates for Internet platforms, or its equally misguided plans to impose a new link tax on those who publish snippets from news stories, you should know that the end game is close at hand. The LIBE (Civil Liberties) Committee is the last committee of the European Parliament that is due to vote on its opinion on the so-called "Digital Single Market" proposals this Thursday, October 5, before the proposals return to their home committee of the Parliament (the JURI or Legal Affairs Committee) for the preparation of a final draft.
The Confused Thinking Behind the Upload Filtering Mandate
The Commission's rationale for the upload filtering mandate seems to be that in order to address unwelcome behavior online (in this case, copyright infringement), you have to not only make that behavior illegal, but also make it impossible. The same rationale underpins other similar notice-and-stay-down schemes, such as one that already exists in Italy; they are meant to stop would-be copyright infringement in its tracks by preventing presumptively-infringing material from being uploaded to begin with, thereby preventing it from being downloaded by anyone.
But this kind of prior restraint on speech or behavior isn't commonly applied to citizens in any other area of their lives. Your car isn't speed-limited so that it's impossible for you to exceed the speed limit. Neither does your telephone contain a bugging device that makes it impossible for you to slander your neighbor. Why is copyright treated so differently, that it requires not only that actual infringements be dealt with (Europe's existing DMCA-like notice and takedown system already provides for this), but that predicted future infringements also be prevented?
More importantly, what about the rights of those whose uploaded content is flagged as being copyright-infringing, when it really isn't? The European Commission's own research, in a commissioned report that it attempted to bury, suggests that the harm to copyright holders from copyright infringement is much less than has often been assumed. At the very least, this has to give us pause before adopting new extreme copyright enforcement measures that will impact users' human rights.
Even leaving aside the human impact of the upload filter, European policymakers should also be concerned about the impact of the mandate on small businesses and startups. A market-leading tool required to implement upload filtering just for audio files would cost a medium-sized file hosting company between $10,000 and $25,000 per month in license fees alone. In the name of copyright enforcement, European policymakers would give a market advantage to entrenched large companies at the expense of smaller local companies and startups.
The Link Tax Proposal is Also Confused
The link tax proposal is also based on a false premise. But if you are expecting some kind of doctrinally sound legal argument for why a new link-tax ought to inhere in news publishers, you will be sorely disappointed. Purely and simply, the
proposal is founded on the premise that because news organizations are struggling to maintain their revenues in the post-millennial digital media space, and because Internet platforms are doing comparatively better, it is politically expedient
that the latter industry be made to subsidize the former. There's nothing more coherent behind this proposal than that kind of base realpolitik.
But the proposal doesn't even work on that level. In fact, we agree that news publishers are struggling. We just don't think that taxing those who publish snippets of news articles will do anything to help them. Indeed, the fact that small news publishers have rejected the link tax proposal, and that previous implementations of the link tax in Spain and Germany were dismal failures, tells you all that you need to know about whether taxing links would really be good for journalism.
So as these two misguided and harmful proposals make their way through the LIBE committee this week, it's time to call an end to this nonsense. Digital rights group OpenMedia has launched a click-to-call tool that you can use, available in Spanish and Polish. If you're a European citizen, the tool will call your representative on the LIBE committee, and if you don't have an MEP on the committee, it calls the committee chair, Claude Moraes. As the counter ticks closer to midnight on these regressive and cynical copyright measures, it's more important than ever for individual users like you to be heard.
cynical copyright measures, it's more important than ever for individual users like you to be heard.
11th October 2017. From OpenMedia
With only 48 hours' notice, we received word that the vote had been delayed. Why? The content censorship measures have become so controversial that MEPs decided they needed more work before they would be ready to go to a vote.
There's never been a better time to call your MEP about these rules. This week they are back in their offices and ready to start thinking with a fresh head. The delay means we have even more time to say no to content censorship, and no to the
Link Tax. With so many people speaking up, it's clear our opponents are rattled. Now we must keep up the pressure.