On 12 September 2018, all 751 Members of the European Parliament (MEPs) got a chance to shape the European copyright reform with a plenary vote.
The outcome: 366 MEPs blatantly ignored your calls asking them to #SaveYourInternet, as they adopted the copyright #CensorshipMachine.
What’s next: The JURI Committee Rapporteur, MEP Axel Voss, has been granted a mandate to start informal negotiations with the representatives of the EU Member States (Council) and the European Commission (EC), the so-called ‘trilogue negotiations’, the black box of the EU policymaking process. See EDRi’s explainer for more details on the remainder of this process.
Article 13 only benefits big businesses
Due to the collateral damage created by the vague and overly broad wording of Article 13, only big platforms and powerful rightsholders will benefit from its adoption to the detriment of all other stakeholders.
Bad for Users
Users will have access to less content and will be unable to share their own content with others, even when it is legal. Moreover, any complaint mechanism will be easily bypassed if blocking is done under the pretext of a terms-and-conditions violation rather than as the result of a specific copyright claim.
Bad for Creators
If platforms become directly liable for user-uploaded content they will arbitrarily remove content based on their terms and conditions. As a result, many creators will see their content get blocked too. And, as fewer platforms survive the burden of this provision, creators will have less choice on where to share their creations.
Bad for competition
Only platforms with deep pockets will be able to comply with the requirements of Article 13. And even if small enterprises are exempted from its scope, that simply means they are not allowed to scale up and compete with the big US platforms, under the motto ‘in Europe, small is beautiful’.
The world should be concerned about new proposals to introduce a system that would automatically filter information before it appears online. Through pre-filtering obligations or increased liability for user uploads, platforms would be forced to create costly, often biased systems to automatically review and filter out potential copyright violations on their sites. We already know that these systems are historically faulty and often lead to false positives.
Professor María Sefidari Huici, Chair of the Wikimedia Foundation
By requiring Internet platforms to perform automatic filtering all of the content that their users upload, Article 13 takes an unprecedented step towards the transformation of the Internet from an open platform for sharing and innovation, into a tool for the automated surveillance and control of its users. (…) we cannot support Article 13, which would mandate Internet platforms to embed an automated infrastructure for monitoring and censorship deep into their networks. For the sake of the Internet’s future, we urge you to vote for the deletion of this proposal.
+70 Internet and computing luminaries, Open Letter
Although the latest proposed versions of Article 13 do not explicitly refer to upload filters and other content recognition technologies, it couches the obligation to prevent the availability of copyright-protected works in vague terms, such as demonstrating ‘best efforts’ and taking ‘effective and proportionate measures.’ (…) I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity, and proportionality of such restrictions. Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching, criticism, satire, and parody.
I think that the overfiltering problem is huge and the norms are so vague. Article 13 is doomed to failure. The Digital Single Market Directive draft is some speculation that if we put these really strict rules in place, all the tech companies and platforms that can afford to license content will do that. I think that’s naive.
Professor Pamela Samuelson, Director of the Berkeley Center for Law & Technology, President of the Authors Alliance
The lesson, for me, is: Don’t tear down the building, be the landlord. It’s far more beneficial for me to embrace the community that is remixing my art, to set my own rules about how my work is used, and to embrace the shared creativity and profits that come from it. It wasn’t easy for me to adapt my thinking, but today I work with a number of online services to give fans what they want while still getting paid.
The concern of the vzbv: out of fear of completely unclear liability rules, much content will disappear from the net. Dubious content, so-called fake news, will on the other hand find it even easier to spread on the internet in the future.
Article 13 of the proposal on Copyright in the Digital Single Market includes obligations on internet companies that would be impossible to respect without the imposition of excessive restrictions on citizens’ fundamental rights. (…) Article 13 appears to provoke such legal uncertainty that online services will have no other option than to monitor, filter and block EU citizens’ communications if they are to have any chance of staying in business.
+50 NGOs representing human rights and media freedom, Open Letter
Claim: The Internet will not be filtered
Assessment: Not true. Upload filters will become an obligation for platforms that want to enter the market. The distinction between the Internet and platforms is artificial. There is hardly any internet service without active user involvement. The spectrum of user-generated content ranges from newspaper websites, blogs and social networking sites to online forums and cloud solutions.
+200 academics from over 25 research centres, Open Letter
Claim: There is no problem relating to freedom of expression
Assessment: Not true. (…) Article 13 motivates firms to use cheap upload filters which will block legitimate content. Complaint and redress mechanisms are insufficient to cope with this problem. Expressions such as permissible parodies will be affected.
+200 academics from over 25 research centres, Open Letter
What’s at Stake?
The various versions of Article 13 create a system whereby platforms face increased (direct) liability for content uploaded by their users if it infringes copyright. As a result, these platforms are likely to over-block even legal content and to use automated techniques to avoid being sued, which means users will no longer be able to share and experience the content they used to find online.
Our Ability To Post Content On The Internet Will Be Limited By A Censorship Machine
Some of the content uploaded to the Internet infringes the copyright of rightholders (who are often not the content creators themselves, but intermediaries and investors such as recording or film studios), and content creators complain that, due to the digital evolution, they make less money than they used to (the so-called ‘value gap’). This does not reflect reality accurately, particularly in the case of the music industry, which year after year announces that its income keeps increasing. What rightholders actually claim is that some platforms (YouTube, Vimeo…) do not pay them enough when they stream copyrighted content: that is what they call the ‘value gap’, the gap between what rightholders think would be fair compensation and what platforms pay them.
Article 13 claims to address these problems, but does so in a way that hampers the way the Internet has functioned so far, by asking platforms to put in place costly and opaque solutions to pre-screen our content. The proposal would require intermediaries such as Facebook and YouTube to constantly police their platforms with censorship machines, often with no human element involved in the process. It means you will no longer be able to upload or enjoy the same content as you used to, as automated blocking is likely to stop (legitimate) content from ever making it online. Analyses by EDRi of the European Commission and JURI proposals show the underlying threats in Article 13’s logic.
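The over-blocking problem is easiest to see in miniature. Below is a deliberately simplified, hypothetical sketch (all names and data invented; real systems such as Content ID use fuzzy audio/video fingerprinting, not exact hashes) of how a pre-screening filter works: it compares a fingerprint of each upload against a blocklist of protected works, so a legal parody or quotation that reuses the same material is blocked exactly like piracy, because the legality of a use depends on context that is simply not present in the bytes.

```python
import hashlib

# Hypothetical blocklist: fingerprints of copyright-protected works.
PROTECTED_FINGERPRINTS = {
    hashlib.sha256(b"full recording of a protected song").hexdigest(),
}

def allow_upload(data: bytes) -> bool:
    """Naive pre-screening: reject any upload whose fingerprint matches
    a protected work, with no notion of parody, quotation or criticism."""
    return hashlib.sha256(data).hexdigest() not in PROTECTED_FINGERPRINTS

# An infringing copy and a (potentially legal) parody reusing the same
# recording are indistinguishable to the filter:
pirated_copy = b"full recording of a protected song"
parody_video = b"full recording of a protected song"  # reused under a parody exception

print(allow_upload(pirated_copy))  # False: blocked, as intended
print(allow_upload(parody_video))  # False: blocked too, even though it may be legal
print(allow_upload(b"an original creation"))  # True: allowed
```

The filter has no way to tell the first two cases apart, which is precisely the false-positive problem the academics and NGOs quoted above describe: copyright exceptions are context-sensitive judgments, and a matching engine only sees matching content.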
And what’s worse: none of the versions of Article 13 make life better for creators. Article 13 actually makes no mention of creators: only rightholders.
What’s on the table?
Looking at both the European Parliament and Council drafts, three key flaws can be identified:
- the text is not balanced with fundamental rights;
- Article 13 puts an end to the e-Commerce Directive for a vast array of platforms, without a proper Impact Assessment; and,
- the provision makes platforms directly liable for user-uploaded content, which implies upload filtering to avoid that liability.
Moreover, rightholders do not even have to identify the works that platforms need to take down, and user safeguards have been reduced to a paper tiger, leaving users without any recourse to push back against wrongful blocking. See the CopyBuzz.com analysis of an early draft of the adopted proposal, as well as COMMUNIA’s handy flowchart on it.
So what’s next?
The JURI Committee Rapporteur, MEP Axel Voss, has been granted a mandate to start informal negotiations with the representatives of the EU Member States (Council) and the European Commission (EC), the so-called ‘trilogue negotiations’. These negotiations are often considered the black box of the EU policymaking process: they take place behind closed doors with little to no public accountability, usually running late into the night to broker a deal, and without the negotiation documents being made publicly available. The process is therefore open to untransparent horse-trading. See EDRi’s explainer for more details on the remainder of this process. But so far nothing is set in stone. The fight against the #CensorshipMachine is far from over: we need to keep up the pressure on both MEPs and national governments to ensure they reach a sensible compromise in the end and #SaveYourInternet.
Source: https://bit.ly/2M8Xyo0