Under disgraceful plans set out last year by the European Commission, news publishers would get extra rights over their content, giving them the right to charge and license publishers seeking to use snippets or short quotes from articles. The
policy has been dubbed 'the link tax'.
Now a key committee of the European Parliament, the Industry, Research and Energy Committee, wants to extend the proposals so that these rights would also cover publishers of academic research. This would surely be a nightmare for open access and open science.
Researchers might have to pay, or might at least have to ask for permission, every time they want to quote another academic's work in their piece.
If the proposed ancillary right is extended to academic publications, researchers, students and other users of scientific and scholarly journal articles could be forced to ask permission or pay fees to the publisher for including short quotations
from a research paper in other scientific publications, according to an open letter from Science Europe.
But even if this latest amendment is not adopted, the wider plan could still make it much harder for everyone, including researchers, to include quotations from news articles in their work, the organisation fears. For example, students might have
to buy a licence for every newspaper quote they use in a thesis. Links to news and the use of titles, headlines and fragments of information could now become subject to licensing. Such licensing terms could make the last two decades of news less accessible to researchers and the public, distorting the public's knowledge and memory of past events.
Next week, MEPs on the European Parliament's powerful Civil Liberties committee will vote on whether to approve the Link Tax and mass content filtering. With your help we've been relentlessly fighting to put a stop to this disastrous duo of
copyright policy, and this is what all that pressure and hard work comes down to.
Let's be clear: these proposals are abusing copyright to censor the Internet. Backed by powerful publishing lobbyists and unelected European Commissioners, they include sweeping powers for media giants to charge fees for links, and requirements
that websites build censorship machines to monitor and block your content. But with the help of tens of thousands of EU citizens, we've made clear to the European Parliament just how dangerous and unpopular these censorship proposals really are.
The European Commission has a well-deserved reputation for bizarre, destructive, ill-informed copyright plans for the internet, and the latest one is no exception: mandatory copyright filters for any site that allows the public to post
material, which will algorithmically determine which words, pictures and videos are lawful to post, untouched by human hands.
These filters already exist, for example in the form of YouTube's notoriously ham-fisted Content ID system, which demonstrates just how bad robots are at figuring out copyright law. But even if we could make filters that were 99% accurate, this
would still be a catastrophe on a scale never seen in censorship's long and dishonorable history: when you're talking about hundreds of billions of tweets, Facebook updates, videos, pictures, posts and uploads, a 1% false-positive rate would
amount to the daily suppression of the entire Library of Alexandria, or all the TV ever broadcast up until, say, 1980.
Companies including Google and Facebook could face repressive legislation if they don't proactively remove content from their platforms that is deemed illegal. That's according to draft EU censorship rules due to be published at the end
of the month, which will require internet service providers to significantly step up their actions to address the EU's demands.
In the current climate, creators and distributors are forced to play a giant game of whac-a-mole to limit the unlicensed spread of their content on the Internet.
The way the law stands today in the United States, EU, and most other developed countries, copyright holders must wait for content to appear online before sending targeted takedown notices to hosts, service providers, and online platforms.
After sending several billion of these notices, patience is wearing thin, so a new plan is beginning to emerge. Rather than taking down content after it appears, major entertainment industry groups would prefer companies to take proactive action.
The upload filters currently under discussion in Europe are a prime example, and they are already causing controversy.
The guidelines are reportedly non-binding but further legislation in this area isn't being ruled out for Spring 2018, if companies fail to address the EU's demands.
Interestingly, however, a Commission source told Reuters that any new legislation would not change the liability exemption for online platforms. Maintaining these so-called safe harbors is a priority for online giants such as Google and Facebook; anything less would almost certainly be a deal-breaker.
The guidelines, due to be published at the end of September, will also encourage online platforms to publish transparency reports. These should detail the volume of notices received and actions subsequently taken. The guidelines contain some
safeguards against excessive removal of content, such as giving content owners the right to contest such decisions.
If you happen to be a fan of the heavy metal band Isis (an unfortunate name, to be sure), you may have trouble ordering its merchandise online. Last year, Paypal suspended a fan who ordered an Isis t-shirt, presumably on the false assumption
that there was some association between the heavy metal band and the terrorist group ISIS.
Then last month Internet scholar and activist Sascha Meinrath discovered that entering words such as "ISIS" (or "Isis"), or "Iran", or (probably) other words from this U.S. government blacklist in the description
field for a Venmo payment will result in an automatic block on that payment, requiring you to complete a pile of paperwork if you want to see your money again. This happens even if the full description field is something like "Isis heavy metal album" or "Iran kofta kebabs, yum."
These examples may seem trivial, but they reveal a more serious problem with the trust and responsibility that the Internet places in private payment intermediaries. Since even many non-commercial websites such as EFF's depend on such
intermediaries to process payments, subscription fees, or donations, it's no exaggeration to say that payment processors form an important part of the financial infrastructure of today's Internet. As such, they ought to carry corresponding
responsibilities to act fairly and openly towards their customers.
Unfortunately, given their reliance on bots, algorithms, handshake deals, and undocumented policies and blacklists to control what we do online, payment intermediaries aren't carrying out this responsibility very well. Given that these private
actors are taking on responsibilities to help address important global problems such as terrorism and child online protection, the lack of transparency and accountability with which they execute these weighty responsibilities is a matter of serious concern.
The readiness of payment intermediaries to do deals on those important issues leads as a matter of course to their enlistment by governments and special interest groups to do similar deals on narrower issues, such as the protection of the
financial interests of big pharma, big tobacco, and big content. It is in this way that payment intermediaries have insidiously become a weak link for censorship of free speech.
Cigarettes, Sex, Drugs, and Copyright
For example, if you're a smoker, and you try to buy tobacco products from a U.S. online seller using a credit card, you'll probably find that you can't. It's not illegal to do so, but thanks to a "voluntary" agreement with law
enforcement authorities dating back to 2005, payment processors have effectively banned the practice--without any law or court judgment.
Another example we've previously written about is the payment processors' arbitrary rules blocking sites that discuss sexual fetishes, even though that speech is constitutionally protected. The congruence between the payment
intermediaries' terms of service on the issue suggests a degree of coordination between them, but their lack of transparency makes it impossible to be sure who was behind the ban and what channels they used to achieve it.
A third example is the ban on pharmaceutical sales. You can still buy pharmaceuticals online using a credit card, but these tend to be from unregulated, rogue pharmacies that lie to the credit card processors about the purpose for which their merchant account will be used. For the safer, regulated pharmacies that require a prescription for the drugs they sell online, such as members of the Canadian International Pharmacy Association (CIPA), the credit card processors enforce a blanket ban.
Finally there are "voluntary" best practices on copyright and trademark infringement. These include the RogueBlock program of the International Anti-Counterfeiting Coalition (IACC) in 2012, about which information is available online,
along with a 2011 set of "Best Practices to Address Copyright Infringement and the Sales of Counterfeit Products on the Internet," about which no information is available online. The only way to find out about the standards that payment intermediaries use to block websites accused of copyright or trademark infringement is by reading what academics have written about them.
Lack of Transparency Invites Abuse
The payment processors might respond that their terms of service are available online, which is true. However, these are ambiguous at best. On Venmo, transactions for items that promote hate, violence, or racial intolerance are banned, but there
is nothing in its terms of service to indicate that including the name of a heavy metal band in your transaction will place it in limbo. Similarly, if you delve deep enough into Paypal's terms of service you will find out that selling tickets to
professional UK football matches is banned, but you won't find out how this restriction came about, or who had a say in it.
Payment processors can do better. In 2012, in the wake of the payment industry's embargo of Wikileaks and its refusal to process payments to European vendors of horror films and sex toys, the European Parliament Committee on Economic and Monetary
Affairs made the following resolution:
[The Committee c]onsiders it likely that there will be a growing number of European companies whose activities are effectively dependent on being able to accept payments by card; [and] considers it to be in the public interest to define
objective rules describing the circumstances and procedures under which card payment schemes may unilaterally refuse acceptance.
We agree. Bitcoin and other cryptocurrencies notwithstanding, online payment processing remains largely oligopolistic. Agreements between the few payment processors that make up the industry and powerful commercial lobbies and governments,
concluded in the shadows, can have deep impacts on entire online communities. When payment processors are drawing their terms of service or developing algorithms that are based on industry-wide agreements, standards, or codes of
conduct--especially if these involve governments or other third parties--they ought to be developed through a process that is inclusive, balanced and accountable.
The fact that you can't use Venmo to purchase an Isis t-shirt is just one amusing example. But the Shadow Regulation of the payment services industry is much more serious than that, also affecting culture, healthcare, and even your sex life
online. Just as we've called other Internet intermediaries to account for the ways in which their "voluntary" efforts threaten free speech, the online payment services industry needs to be held to the same standard.