Law ain’t Code: Upload filtering technologies and the CDSM Directive

A new provision has entered the scene of EU copyright regulation, deeply affecting how online content is uploaded, accessed and enjoyed. Some considerations on the legal and technological impact of Article 17 of the Copyright in the Digital Single Market Directive.

When Lawrence Lessig published “Code and Other Laws of Cyberspace” in 1999, a new – often misunderstood – mantra was born: “Code is Law”. With his book (later updated as Code Version 2.0), Lessig helped us understand that cyberspace, despite its peculiar nature, was not beyond the reach of physical space regulation. Contrary to popular ideas at the time, which claimed that cyberspace should not – and could not – be regulated, Lessig argued that, in cyberspace, freedom would not come from the absence of the State and that, if left to itself, cyberspace would be misused as a tool for control and restriction. As such, a new form of public legal ordering needed to be built for our fundamental values and rights to also be protected online. And it needed to be built within the maze of Codes.

Lessig argued that Code – both in its software and hardware formats – should be recognised as a regulatory component of cyberspace, as it plays a de facto constitutive and key role in its normative architecture. Just as natural and man-made laws guide and constrain our behaviour in physical space, in cyberspace we act, communicate, exchange and behave according to what the Code allows or forbids us to do.

The debate surrounding Article 17 of the EU Directive on Copyright in the Digital Single Market 2019/790 (CDSM Directive) is a prime example of the difficult relationship between Code and law. In this Insight, I will explore how the law seems to mandate a fundamental change in the technical architecture of cyberspace, disregarding how Code works… and the heated confusion that has followed.

The problem: the proposed Art. 17 CDSM Directive

In sum, the objective of the proposal (then Article 13) that later led to Article 17 CDSM Directive was to make large content-sharing platforms (like YouTube or Vimeo) directly liable for their users’ uploads in cases where the content made public infringed someone’s copyright. The idea was that platforms could avoid this burden of liability either by obtaining a specific authorisation from right holders covering the uses of copyrighted works by the platforms’ users, or by preventing the availability of copyright-protected works on their services. This idea poses several problems.

First, it creates a new, specific liability regime for platforms regarding copyright infringements, which represents a departure from the so-called safe harbour provisions, i.e. the traditional intermediary liability limitations that the e-Commerce Directive ensures for hosting providers (such as platforms like YouTube and Vimeo).

Second, introducing an obligation to prevent the availability of infringing works implicitly assumes there is a good and affordable way for all platforms to do this. The European Commission’s proposal expressly referred to the use of “effective content recognition technologies”, better known as “upload filters”. This proposal was inspired by YouTube’s “Content ID”, a content recognition system reportedly worth 100 million dollars, which only very large platforms could afford to acquire or replicate.

Lastly, and most broadly, filtering what users can publish online is highly controversial from the perspective of safeguarding the fundamental right to freedom of expression on the Internet. It also raises questions of proportionality, since existing filtering technologies can only recognise content, not make a legal evaluation of the lawfulness of its use. This has inflamed the copyright debate, which has pivoted on the risks of over-enforcement of Art. 17 and the need to protect freedom of expression and legitimate uses of content online (for example, for parody, quotation or educational purposes).

Which upload filters? The adopted Art. 17

Picking up the opening thread of this Insight: Regulation in cyberspace is primarily imposed through Code – Lessig warned us – and building such architecture entails making choices. “Which values will we want to protect?” he asks. Surprisingly, EU lawmakers’ reply to this question was vague, if not completely missing, when it came to defining the nature and role of upload filters ex Art. 17.

In what seems to be an attempt to dissolve any concern about censorship, explicit references to “effective content recognition technologies” were removed from the text of the Directive. Both the European Commission and the European Parliament used this removal to publicly deny that the Directive required platforms to introduce upload filters. However, as Advocate General Saugmandsgaard Øe pointed out (para. 60-62), and as the CJEU agreed (para. 53-56), upload filters are a must if platforms want to comply with Art. 17.

Furthermore, as an additional safeguard, Art. 17(7) of the Directive was amended to include the requirement that any mechanisms used “shall not result in the prevention of the availability of works (…) which do not infringe copyright (…), including where such works (…) are covered by an exception or limitation”. Once again, what sounds like balanced legal wording runs into the problem of having, to date, no viable technological application. The most advanced upload filtering systems currently available on the market, such as Content ID, cannot identify uses of copyrighted content that are permitted by EU and national law.
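
To see why, consider a minimal sketch in Python of how a matching-based filter decides. All names here are hypothetical and the matching is deliberately naive (real systems such as Content ID use robust perceptual fingerprinting rather than substring checks), but the structural limit is the same: the decision turns on whether content matches a reference work, never on whether its use is lawful.

PROTECTED_WORKS = ["some protected melody"]  # hypothetical right holders' reference database

def filter_upload(upload: str) -> str:
    """Return 'block' or 'allow' based purely on content matching."""
    for work in PROTECTED_WORKS:
        if work in upload:
            # The filter only knows THAT protected content is present.
            # Nothing here encodes context: a parody, a quotation or a
            # use covered by an exception looks identical to any match.
            return "block"
    return "allow"

# A verbatim copy and a lawful quotation both contain the work,
# so both trigger the same outcome:
print(filter_upload("some protected melody"))                        # block
print(filter_upload("A critic writes: 'some protected melody'..."))  # block

As long as the lawfulness of a use is not an input the Code can read, no amount of legal wording will make the filter draw the distinction Art. 17(7) demands.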

By and large, the question remains: how can platforms, on the one hand, fulfil an obligation to remove copyright-infringing content while, on the other hand, not remove the exact same content when its use is legitimate? While the law clearly mandates them to do so, upload filtering mechanisms have not yet provided a technological answer to this question.

Further proving this point, complaint and redress mechanisms were also added to the text of the Directive, as ex post measures that can fix problems of unlawful overblocking of content online. These mechanisms, however, serve as mere complementary mitigation measures, intervening as promptly as possible (yet always ex post) once users’ fundamental rights may already have been violated.

The CJEU’s interpretation

This discussion resulted in a new, highly complex provision of EU copyright law. What started as a simple three-paragraph proposal became a two-page article, a “monstrosity of a provision” according to Dusollier and Jütte, among others, with conflicting objectives and huge interpretation challenges.

But the clash between Law and Code was now here to stay. As soon as the Directive was adopted, Poland challenged Art. 17 before the CJEU, rounds of stakeholder dialogues continued with ever more radically different interpretations of the article, and national implementations did not promptly flow in, partially because of the late publication of the EC guidelines on the implementation of Art. 17, which, however, did not dispel many of the doubts illustrated above (see Jütte/Priora).

In its judgment, the CJEU confirmed that Art. 17 entails a limitation of the freedom of expression and information, but considered such limitation to be justified under Art. 52(1) of the Charter and thus legitimate. In its reasoning, the Court put particular emphasis on the fact that the Directive sets forth strong safeguards to protect end-users in their legitimate uses of protected works. The CJEU firmly restated that the Directive does not allow for the use of automatic recognition and filtering tools “which filter and block lawful content when uploading”, citing case law where it held that “a filtering system which might not distinguish adequately between unlawful content and lawful content (…) would be incompatible (…) with Art. 11 of the Charter” (para. 85-86). In other words, as Husovec puts it, the Court considered that “Internet filters do not infringe freedom of expression if they work well. But will they?”

Lastly, the CJEU considered the use of upload filters “not only appropriate but (…) necessary to meet the need to protect intellectual property rights”, as other, less restrictive measures “would not be as effective” (para. 83). Oddly, the EU legislator did not feel the need for a similar liability regime in the recent Digital Services Act with regard to other types of illegal content (see recital 16; for the interplay between the two, see Quintais/Schwemer). Even more surprisingly, the CJEU’s statement is not backed by full reasoning. In its proportionality assessment, the CJEU did not assess the impact of the use of such filters on users’ fundamental rights, contenting itself with the presumed greater effectiveness of upload filters over alternative options.

Conclusion

One of the main features of Code is that it leaves no margin for ambiguity: depending on the given inputs, the Code will either run or not run. Even the most basic Code does not run if there is a problem with it – the system will simply return an error. For example, in Python:

a = "upload filters"
b = copyright_infringement
print(a + ", please detect", b)

As there is something wrong with this Code (the quotation marks around copyright_infringement are missing), the system will return an error:

NameError: name 'copyright_infringement' is not defined
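
For contrast, a minimal fix – adding the missing quotation marks – makes the very same Code run without complaint:

a = "upload filters"
b = "copyright_infringement"
print(a + ", please detect", b)
# prints: upload filters, please detect copyright_infringement

There is no middle ground: the Code either satisfies its own rules and runs, or it does not.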

However, legislative fiction can afford to enter uncertain inputs without “errors” being displayed. While Art. 17 might be considered balanced and proportionate in its literal reading, our current technological reality seems unable to achieve its pursued objectives. In other words, Art. 17 is legally (but not technologically) balanced. As such, Art. 17 carries the risk of a sub-optimal application, a remarkable expectation on platforms to apply it in a balanced and effective way, and an inevitable burden of interpretation on national Parliaments and courts. Just as the law could grant us the right to live forever without making us immortal, the law may well state what the outcome of a certain technology should be, but that does not mean the technology will achieve such an outcome. Just because it is written in the law does not mean it will become a technical reality. Law is not Code. Lessig was right: in cyberspace, Code is indeed a regulator.

