
Wednesday, May 29, 2019

To Monitor or Not to Monitor? The Uncertain Future of Article 15 of the E-Commerce Directive

New Controversies in Intermediary Liability Law

Aleksandra Kuczerawy

Current policy discourse in the European Union is steadily shifting from intermediary liability to intermediary responsibility. The long-established principle prohibiting general monitoring obligations is now being challenged by two initiatives in particular, namely the Copyright in the Digital Single Market Directive and the proposed Regulation on preventing the dissemination of terrorist content online. By holding service providers liable and requiring them to implement broad censoring measures, this growing trend toward imposing monitoring obligations may have significant ramifications for the ability of individuals to freely share and access content online.

No General Monitoring Obligation

The limited liability regime for Internet intermediary services in the European Union is currently undergoing major changes. To date, the liability exemptions for Internet intermediary service providers have been governed by the E-Commerce Directive (ECD). The Directive applies horizontally to various domains and to any kind of illegal or infringing content.

Under Article 15 of the ECD, E.U. Member States may not impose on intermediary service providers a general obligation to monitor information that they transmit or store. Further, Member States cannot introduce a general obligation to actively look for facts or circumstances indicating illegal activity. The prohibition of monitoring obligations (Recital 47) refers solely to monitoring of a general nature. It does not concern monitoring obligations in a specific case; nor does it affect orders issued by national authorities in line with national legislation.

The ECD, moreover, does allow Member States to require hosting providers to apply duties of care. Such duties of care, however, should only be introduced to detect and prevent certain types of illegal activities foreseen by national law. The Directive does not specify what exactly such duties of care entail. As a result, the boundary between duties of care and general monitoring remains unclear.

Per Article 15, Member States are not allowed to introduce obligations that would require intermediary service providers to systematically monitor the information they store or transmit. This does not mean that service providers cannot undertake such activities on their own initiative. Most service providers in the European Union do perform certain voluntary monitoring activities in order to maintain a “civilized” environment on their platforms. Voluntary monitoring, however, can prove detrimental. Exercising too much control could compromise a provider’s neutral status and, in consequence, deprive it of the safe harbour protection provided by Article 14 of the ECD. The E.U. intermediary regime contains no provision protecting service providers from liability should their voluntary monitoring prove imperfect (such as the one offered by Section 230(c)(2) of the Communications Decency Act in the United States). The lack of a Good Samaritan-style protection therefore makes E.U. service providers cautious when conducting monitoring activities on their platforms.

Current policy discourse in the European Union is steadily shifting from intermediary liability to intermediary responsibility. The amended Audiovisual Media Services Directive, the Directive on copyright in the Digital Single Market, and the proposal for a Regulation on preventing the dissemination of terrorist content online, as well as recent soft law initiatives, such as the Code of Conduct on Countering Illegal Hate Speech Online and the more recent Code of Practice on Disinformation, are all examples of this trend.

Liability is understood as a negligence-based approach, while responsibility emphasizes the need for proactive measures. The long-established principle prohibiting general monitoring obligations is currently being challenged by two initiatives in particular, namely the Copyright in the Digital Single Market Directive and the proposed Regulation on preventing the dissemination of terrorist content online.

Copyright in the DSM Directive

In mid-February 2019, at the end of the so-called “trilogue” procedure, the negotiating E.U. institutions reached a compromise on the text of the Copyright in the DSM Directive. The text was adopted by the European Parliament on 26 March 2019 and ultimately approved by the Council on 15 April 2019. Article 17 (formerly Article 13) of the Copyright in the DSM Directive introduces a substantial change to the established intermediary liability regime, in that it makes service providers directly liable for the content uploaded by their users. To avoid liability, service providers must enter into licensing agreements with the owners of any content they might host. Per Article 17.4b, if a service provider does not want to (or is not able to) pay licensing fees, it must demonstrate that it made best efforts to ensure the works are unavailable “in accordance with high industry standards of professional diligence.” Only then can it escape direct liability for the content of its users.

The Copyright in the DSM Directive stipulates that the application of Article 17 “shall not lead to any general monitoring obligation.” Interestingly, the final text of the provision no longer refers specifically to Article 15 of the ECD. The provision clearly aims to pacify the Directive’s numerous critics. Despite the insistence that Article 17 will not lead to a general monitoring obligation, it is hard to imagine how else service providers can ensure that copyrighted works are not made available without proper licensing. To effectively recognize infringing content, a technological tool must examine all newly uploaded content on the platform and compare it with an existing database. This amounts to service providers installing upload filters and systematically monitoring the entirety of their users’ content. Despite repeated attempts to convince the broader public that the Copyright in the DSM Directive was not meant to introduce upload filters, several officials admitted, soon after the vote, that filters are unavoidable.
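To see why critics equate such filtering with general monitoring, consider a deliberately minimal sketch of an upload filter, written in Python. Every name below is hypothetical, and real systems rely on perceptual fingerprints that survive cropping and re-encoding rather than exact digests; the sketch only illustrates the structure of the check.

    import hashlib

    # Hypothetical database of fingerprints of known protected works.
    # Production filters use perceptual fingerprinting; an exact
    # SHA-256 digest is used here only to keep the sketch short.
    KNOWN_WORK_FINGERPRINTS = set()  # e.g. populated from rightsholder data

    def fingerprint(content: bytes) -> str:
        """Return an exact-match fingerprint of the uploaded bytes."""
        return hashlib.sha256(content).hexdigest()

    def may_publish(upload: bytes) -> bool:
        """Decide whether an upload may go live.

        To decide anything about one upload, the platform must inspect
        every upload -- the structural feature that makes such a filter
        resemble general monitoring.
        """
        return fingerprint(upload) not in KNOWN_WORK_FINGERPRINTS

The structural point does not depend on the matching technique: whatever tool is chosen, it must run on the entirety of users’ uploads before publication.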

Proposal for a Regulation on Preventing Dissemination of Terrorist Content Online

In September 2018, the European Commission issued a proposal for a Regulation on preventing the dissemination of terrorist content online. Apart from one-hour content removal orders (Article 4) and content referrals (Article 5), the proposed Regulation provides that service providers should take “proactive measures to protect their services against the dissemination of terrorist content” (Article 6). Under Article 6 of the European Commission’s proposal, service providers are expected to check content against publicly or privately held tools containing known terrorist content. They may also use “reliable technical tools to identify new terrorist content,” either those available on the market or those developed by the service provider. If the competent authority considers the measures taken insufficient, it may request that the provider take “specific additional proactive measures.” If no agreement can be reached, the competent authority has the power to impose “specific additional (…) proactive measures.”
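Mechanically, checking uploads against a database of known terrorist content is the same operation as the copyright filter sketched above. The following Python fragment is a hypothetical illustration only (all names invented; industry hash-sharing databases are the loose real-world analogue), shown here to make the overlap between the two regimes concrete.

    import hashlib

    # Hypothetical shared database of hashes of known terrorist content.
    SHARED_HASH_DB = set()  # populated from a shared industry database

    def flag_for_review(upload: bytes) -> bool:
        """Return True if the upload matches known terrorist content
        and should be withheld pending human review."""
        return hashlib.sha256(upload).hexdigest() in SHARED_HASH_DB

Identifying new terrorist content, which Article 6 also contemplates, would require classifiers rather than hash lookups, and hence an even broader inspection of users’ uploads.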

Recital (19) of the European Commission’s proposal states that imposing “specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor,” as prohibited by Article 15 of the ECD. After this optimistic note, the proposal explains that in light of the “particularly grave risks associated with the dissemination of terrorist content,” the decisions adopted on the basis of the Regulation could, in fact, derogate from Article 15’s prohibition. The derogation would apply to “certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons.” It would seem, therefore, that a general monitoring obligation may very well be the intended outcome of the Regulation, in contradiction with Article 15.

On 17 April 2019, the European Parliament adopted its Report on the proposed Regulation. The Parliament Report made significant changes to the proposal as a whole and to Article 6, which no longer mentions “proactive” measures. Instead, Article 6 now states that service providers “may take specific measures” that should be “effective, targeted and proportionate.” The amended Article 6.4 provides that the competent authority “may send a request for necessary, proportionate and effective additional specific measures” to hosting providers that have received a substantial number of removal orders. However, the competent authority “shall not impose a general monitoring obligation, nor the use of automated tools.” The Parliament Report also amended Recital (19) of the proposed Regulation. The new version of the recital no longer contains the worrying statement that Article 15’s prohibition could be derogated from for overriding public security reasons.

The changes introduced by the Parliament Report eliminate the most significant threats to Internet freedoms. The issue, however, is not yet resolved. The proposed Regulation must still pass through the “trilogue” procedure, where proactive measures may be reintroduced by the Council of the European Union.

Outlook

According to the Court of Justice of the E.U. in Scarlet v. SABAM and SABAM v. Netlog, a requirement to install a filtering system capable of identifying specific types of content, applied to almost all information stored by users, indiscriminately to all of them, as a preventive measure, and for an unlimited period of time, amounts to a general monitoring obligation. It is difficult to imagine how the “best efforts” and “proactive measures” envisaged by the Copyright in the DSM Directive and the proposed Terrorist Content Regulation, respectively, would not constitute such an obligation. Despite multiple warnings that the proposed measures would undermine the E.U. acquis, policy makers are still moving forward. One therefore cannot help but wonder whether the era of no general monitoring obligations is coming to an end. By holding service providers liable and requiring them to implement broad censoring measures, this growing trend toward imposing monitoring obligations may have significant ramifications for the ability of individuals to freely share and access content online.

Dr. Aleksandra Kuczerawy is a postdoctoral researcher at the Centre for IT and IP Law at KU Leuven, Belgium. She is the author of Intermediary Liability and Freedom of Expression in the EU: From Concepts to Safeguards (Intersentia, 2018). She can be reached at aleksandra.kuczerawy at kuleuven.be.