The EU is considering a copyright proposal that would require code-sharing platforms to monitor all content that users upload for potential copyright infringement (see the EU Commission's proposed Article 13 of the Copyright Directive). The proposal is aimed at music and videos on streaming platforms, based on a theory of a "value gap" between the profits those platforms make from uploaded works and what copyright holders of some uploaded works receive. However, as written, it captures many other types of content, including code.
We'd like to make sure developers in the EU--who understand that automated filtering of code would make software less reliable and more expensive, and who can explain this to EU policymakers--participate in the conversation.
Why you should care about upload filters
Upload filters ("censorship machines") are one of the most controversial elements of the copyright proposal, raising a number of concerns, including:
Privacy: Upload filters are a form of surveillance, effectively a "general monitoring obligation" prohibited by EU law
Free speech: Requiring platforms to monitor content contradicts intermediary liability protections in EU law and creates incentives to remove content
Ineffectiveness: Content detection tools are flawed (they generate false positives and don't fit all kinds of content) and overly burdensome, especially for small and medium-sized businesses that might not be able to afford them or the resulting litigation
Upload filters are especially concerning for software developers given that:
Software developers create copyrightable works--their code--and those who choose an open source license want to allow that code to be shared
False positives (and negatives) are especially likely for software code because code often has many contributors and layers, with different licensing for different components
Requiring code-hosting platforms to scan and automatically remove content could drastically impact software developers when their dependencies are removed due to false positives
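To make the false-positive concern concrete, here is a minimal sketch of how a naive upload filter might work. Everything in it is hypothetical--the fingerprinting scheme, the `blocked_fingerprints` database, and the `clamp` snippet are all invented for illustration--but it shows the core problem: a filter matches bytes, not licenses, so a permissively licensed snippet that thousands of projects legally vendor gets blocked everywhere once any one copy lands in the filter's database.

```python
import hashlib

# Hypothetical sketch: a naive upload filter that fingerprints source
# files by hashing normalized text and blocks anything whose fingerprint
# appears in a database of "copyrighted" content.

def fingerprint(source: str) -> str:
    # Collapse whitespace so trivial reformatting doesn't evade matching.
    normalized = " ".join(source.split())
    return hashlib.sha256(normalized.encode()).hexdigest()

# An MIT-licensed utility snippet that many projects legitimately reuse.
mit_licensed_snippet = """
def clamp(value, low, high):
    return max(low, min(value, high))
"""

# Suppose the database was seeded from one project containing the snippet;
# its fingerprint is now treated as infringing.
blocked_fingerprints = {fingerprint(mit_licensed_snippet)}

def filter_upload(source: str) -> bool:
    """Return True if the upload would be blocked by the filter."""
    return fingerprint(source) in blocked_fingerprints

# Every other project that lawfully includes the snippet is now a false
# positive: the filter sees identical bytes and cannot see the license.
print(filter_upload(mit_licensed_snippet))
```

A real filter would use fuzzier matching than an exact hash, but that only widens the net: similarity matching flags code that merely resembles something in the database, which is common in software precisely because sharing and reuse are the point of open source licensing.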
A German law requiring social media companies like Facebook and Twitter to remove reported hate speech without enough time to consider the merits of each report is set to be revised, following criticism that too much online content is being blocked.
The law, called NetzDG for short, is an international test case, and how it plays out is being closely watched by other countries considering similar measures.
German politicians forming a new government told Reuters they want to add an amendment to help web users get incorrectly deleted material restored online.
The lawmakers are also pushing for social media firms to set up an independent body to review and respond to reports of offensive content from the public, rather than leaving it to the social media companies, who by definition care more about profits than about supporting free speech.
Such a system, similar to how video games are policed in Germany, could allow a more considered approach to complex decisions about whether to block content, legal experts say.
Facebook, which says it has 1,200 people in Germany working on reviewing posts, out of 14,000 globally responsible for moderating content and account security, said it was not pursuing a strategy to delete more than necessary. Richard Allan, Facebook's vice president for EMEA public policy, said:
People think deleting illegal content is easy, but it's not. Facebook reviews every NetzDG report carefully and with legal expertise, where appropriate. When our legal experts advise us, we follow their assessment so we can meet our obligations under the law.
Johannes Ferchner, spokesman on justice and consumer protection for the Social Democrats and one of the architects of the law, said:
We will add a provision so that users have a legal possibility to have unjustly deleted content restored.
Thomas Jarzombek, a Christian Democrat who helped refine the law, said the separate body to review complaints should be established, adding that social media companies were deleting too much online content. NetzDG already allows for such a
self-regulatory body, but companies have chosen to go their own way instead. According to the coalition agreement, both parties want to develop the law to encourage the establishment of such a body.
Last week, the European Parliament's MEP in charge of overhauling the EU's copyright laws did a U-turn on his predecessor's position. Axel Voss is charged with making the EU's copyright laws fit for the Internet Age, yet in a staggering disregard for advice from all quarters, he decided to include an obligation on websites to automatically filter content.
Article 13, which sets out how online platforms should manage user-uploaded content, appears to have the most dangerous implications for fundamental rights. Never mind that the new Article 13 proposal runs directly contrary to an existing EU law -- the eCommerce Directive -- which prohibits member states from imposing general monitoring obligations on hosting providers.
Six countries -- Belgium, the Czech Republic, Finland, Hungary, Ireland, and the Netherlands -- sought advice from the Council's Legal Service last July, asking specifically whether the standalone measure/obligation as currently proposed under Article 13 would be compatible with the Charter of Fundamental Rights, and whether the proposed measures are justified and proportionate. Those questions do not seem to have been addressed.
The aim of the rule, which is in line with the European Commission's proposals more than a year ago, is to strengthen the music industry in negotiations with the likes of YouTube, Dailymotion, etc. Under Voss' revised Article 13, websites and apps
that allow users to upload content must acquire copyright licenses for EVERYTHING, something that is in practice impossible. If they cannot, those platforms must filter all user-uploaded content.
The truth is that this latest copyright law proposal favors the rights-holders above anyone else. And we thought MEPs represented the people.
Google has tweaked its image search to make it slightly more difficult to view images in full size before downloading them. Google has also added a more prominent copyright warning.
Google acted as part of a peace deal with photo library Getty Images. In 2017, Getty Images complained to the European Commission, accusing Google of anti-competitive practices.
Google said it had removed some features from image search, including the view image button. Images can still be viewed in full size from the right-click menu, at least on my Windows version of Firefox. Google also removed the search by image button, which was an easy way of finding larger copies of photographs. Perhaps the tweaks are more about restricting the finding of high-resolution versions of images than about standard-sized images.
Getty Images is a photo library that sells the work of photographers and illustrators to businesses, newspapers, and broadcasters. It complained that Google's image search made it easy for people to find Getty Images pictures and take them without the appropriate permission or licence.
In a statement, Getty Images said:
We are pleased to announce that after working cooperatively with Google over the past months, our concerns are being recognised and we have withdrawn our complaint.