
Is Article 13 really the end of the open internet?

There has been a great deal of controversy about the new EU Copyright Directive. Much of this controversy has focused on the famous “article 13”. If you listen to the opponents of the proposal, it is the end of the internet as we know it. The internet where everyone can communicate with everyone, without permission, would be gone. If you listen to its supporters, it will finally give creators and artists more control over their rights online, after years of copyright abuse facilitated by internet giants. Amidst all the noise, it is hard to understand what on earth is going on.

The reason there is so much noise around the topic is that the proposal is very complex, not least because article 13 is not actually one proposal. It is a 1,200-word proposal on filtering, on internet platform liability, on licensing, on redress mechanisms, and on cooperation between rightsholders and internet platforms. This is exacerbated by the fact that it tries to rebalance three different relationships: platforms and users, platforms and copyright holders, and users and copyright holders (even though, of course, many of us are both copyright holders and users). Balanced discussion is also made more difficult by the contradiction between what creators hope the proposal will bring them and the proposal’s vagueness and possible unintended consequences, which its opponents foresee.

The hope of those who support the proposal is that the largest platforms would be required to either pay for use of copyrighted content that is uploaded or block content that is not authorised to be uploaded. Is this hope justified?

The Proposal

What content is being protected

The proposal refers to “works or other protected subject matter”. This means anything that is subject to intellectual property rights protection—copyright, and also other rights protected under the broad notion of “intellectual property”. This covers the obvious things like video and audio, but also less obvious things like choreography and trademarks.

Which internet companies will be affected

All internet companies that host, promote and, for profit, “organise” “large amounts of works or other subject matter” for their users are covered, except those that are small and less than three years old. Unfortunately, no one knows what any of this really means. The reference to “organising” echoes a European Court case involving The Pirate Bay, so it will be up to the varying wisdom of the courts of 27 EU Member States to define what that means in relation to more traditional services. Nobody knows what “large amounts” means either. One thing is clear, however: despite all of the political spin to the contrary, this is not only about music and video, nor only about Google and Facebook.

What are they supposed to do

Platforms are supposed to use “best efforts” to prevent future uploads of any material that has been subject to a “sufficiently substantiated notice” by people or companies claiming to have rights over the material. This can be audio, video, text, images of text, images, photographs, choreography, etc. Platforms would also have to report to rightsholders about the effectiveness of the technologies they use to achieve this goal. The good news for those making false copyright claims is that this is not punished by the Directive. This means anyone can make a claim about any content without being penalised. Even under the existing legal framework, it is so easy and risk-free to make false claims that fraudsters have recently started demanding money from YouTube channels in return for not submitting false copyright claims. Article 13 expands the power of those making complaints, but does nothing to make them more accountable.

What types of use will be banned

In our society, freedom is the default, and restrictions, including copyright, are the exception. To ensure that copyright restrictions are not excessive, the law provides for exceptions. For example, EU law contains optional exceptions allowing the use of copyrighted content for private use, parody, education, and so on. Under article 13, filters would need to be able to recognise these exceptions. They cannot. As a result, these crucial exceptions will be impossible to use in practice.

In addition, there are absurd laws, which have almost never been applied, that will suddenly become easy to enforce. In some EU countries, it is a copyright infringement to take photos or videos of, or even paint, buildings or sculptures that are in public places. So, for instance, if you are a photographer and your photo includes a building designed by an architect who wishes to assert their rights, then your picture could be prevented from being uploaded. (Similarly, memes frequently use copyrighted content, so they could also be filtered out, if the rightsholder requested this.)

Under article 13, filters would not only need to be able to recognise and block all “identified” buildings, they would also have to take into account national copyright regimes. Under French, Italian and Slovenian rules, which allow limited or no freedom of panorama, the filters would have to block the picture. Under Austrian, British and Irish rules, the picture could be allowed online. How a platform that operates in all of these countries is meant to apply its filters is anybody’s guess.

Regardless of the lawmakers’ intentions, platforms will obviously take the easy route of ignoring all these exceptions and national contexts and simply blocking anything that might be an infringement. Blocking creates no legal risk, while allowing content to be published does.

Will users have a way to complain

Even for the original drafters of article 13, this all seemed somewhat one-sided, so it was decided that some sort of balance should be added. It is therefore proposed in article 13 that, when users’ content is deleted, they should have access to a redress mechanism. Or, to be precise, if the platform admits to the user that their content was deleted on the basis of a national implementation of article 13.1 of the Copyright Directive, the user would have the right to redress. If, however, the platform chooses to say that the deletion was the result of a terms of service violation, then the platform does not have to go to the expense of setting up or implementing the redress mechanism. So, there is a redress mechanism in the Directive, but it will not exist in the real world. The promise that users will have a meaningful way to object and assert their rights is simply not true.

The Consequences

Who will the proposal benefit

So, which platforms are likely to be able to cope with the unpredictable demands and unknown costs of complying with these rules? Small startups, or Google and Facebook? Which rightsholders will be able to feed the right information to every relevant hosting service in every relevant country? Independent artists and creators, or the biggest rightsholders? Guess what: the biggest rightsholders and the biggest platforms are the only ones that can reasonably be expected to cope.

Where does that leave an independent creator or artist

Simply put, independent creators will be stuck behind three gatekeepers: the platforms, the collecting societies and the filtering companies.

First of all, the creator’s freedom and negotiating power will decline. The big platforms are the only ones who can survive in this legal chaos. So, creators go from being able to share their content anywhere they want, to increasingly being restricted to using a few quasi-monopoly providers. These platforms are within their rights to block anybody’s work, if they so wish. They are free not to enter into a licensing agreement and block the content instead, if they so wish. They are free to remove content on the basis of false claims, if they so wish. They hold all of the negotiating cards. In order to connect with existing or new audiences, creators and artists will become even more dependent on these few platforms—and subject to their whims.

So, the only way for creators to meaningfully engage with this system is to join forces and work with a collecting society, a second gatekeeper. These companies will license, identify and report on the creator’s behalf, and take a cut of the revenue. This will be especially damaging to new artists, who will be negotiating contracts with collecting societies at a time when their bargaining power is at its lowest. If they choose not to work with a collecting society, artists are alone, negotiating with Google for revenue, updating Google’s blocking database to prevent their content from being uploaded, fighting false claims of ownership of their work or trying to get unjustified takedowns overturned.

As if this were not bad enough, there are very few companies that provide the technology necessary to fulfil the obligations of the Directive with regard to filtering uploads. Whoever puts a picture, a sound or a video clip onto the databases of these companies first will have control of any subsequent uses, remixes or parodies. The filtering companies will therefore be a third new gatekeeper between artist and audience. This means that there will always be a risk that someone has done something similar to you, which will prevent your content from becoming available. We already see this happening. In one particularly absurd example, thirty seconds of a nine-minute video of a microphone being tested was “identified” by YouTube’s Content ID as the creative work of somebody else and blocked.

Under article 13, we move from an internet where artists can connect with their audience on their own terms to a world where an artist has to license their rights to a collecting society, which will license those rights (or not) to a tiny number of US online platforms, which will have the right and opportunity to set the terms of the agreement. If an agreement is not reached on terms that suit the platform, the platform will simply block the content.

This is not about copyright infringements, but about controlling speech.

This is not about infringements. Indeed, there is virtually no mention of infringements in the proposals on article 13. It is about “identification”. It is about giving power and control to intermediaries like collecting societies. It is about imposing obligations on online companies that only the biggest will have the resources to cope with. The harm and burden will be passed on to individuals. We will have to guess what will be allowed or not, and we will self-censor, knowing that we are powerless in the face of the giant corporations in charge.

In summary, badly defined platforms, fearful of unclear liability rules, are meant to use undefined technologies in order to do “enough” (it’s unclear what enough is) to prevent the availability of content that has been identified by rightsholders. Those making the “identification” run no legal risk because there is no penalty for incorrect claims. The platform runs no legal risk if it deletes everything that might cause a problem. All harms and all risks fall on the uploader. A lot of unknowns, you say? These are just the ones we are aware of.

Conclusion

If article 13 were holding the biggest platforms to account, it would be a good thing. If article 13 were ensuring that artists could reach their audiences more easily, it would be a good thing. If article 13 were reducing the number of gatekeepers, giving more power to smaller artists and allowing them to get paid more easily for their work, it would be a good thing. Instead, it is dismantling the internet as we know it, strengthening the strong, and creating legal chaos. Many opponents of article 13, including in civil society, support the proposal’s goal of giving more power and more control to artists. Sadly, there is little evidence that the proposal has any chance of achieving this goal.

As the International Federation of Journalists said: “the Copyright Directive makes a mockery of journalists’ authors’ rights by promoting buy-out contracts and bullying to force journalists to sign away their rights and giving publishers a free ride to make more profits while journalists receive zero”. This analysis applies across the board. Article 13 is not made for artists.


✨This is the first in a series of seven commissioned essays for 2019. With these original essays, our aim is to publish work that engages with digital visual culture, both in its niche manifestations and within the technological, political, and mainstream reality of the internet.

About Joe: Joe McNamee, former Executive Director of European Digital Rights (EDRi), carved out a space in Brussels, at the heart of the European Union, for digital fundamental rights to be heard. EDRi has fought excessive copyright regulations in the EU—most recently against Articles 13 and 11. EDRi has also worked for Europe’s net neutrality rules, against privatised law enforcement, and was instrumental in the bruising lobbying battle over the GDPR, the “General Data Protection Regulation” that increased digital privacy for people in Europe and beyond. McNamee joined EDRi in 2009, at a time when there were no digital rights advocacy groups based in Brussels, despite the importance of EU decision-making for global digital freedom. During the nine years since, EDRi has grown to become an established part of digital rights policy-making. Prior to joining EDRi, McNamee worked for eleven years on Internet policy, including for the European Internet Services Providers Association. He started his Internet career working on the CompuServe UK helpdesk in 1995.