CJEU Endorses ‘Notice and Stay Down’ For Illegal Online Content


Image by EFF.

In a major ruling on October 3, 2019, the CJEU in Case C-18/18, Eva Glawischnig-Piesczek v Facebook Ireland Limited, held that national courts may issue injunctions against online content-hosting intermediaries like Facebook requiring the intermediary to remove the specific content globally, as well as identical or equivalent content or information. This marks a crucial turn in online content regulation in the EU, as it explicitly endorses a ‘notice-and-stay-down’ standard for content removal, along with upload filters for online content, the difficulties of which we have written about here and here.

Case Background

The genesis of the dispute lies in a Facebook post containing a photograph of Austrian politician Eva Glawischnig-Piesczek, a member of the Austrian ‘Greens’ Party. The post in question linked to a news article along with certain commentary on Ms. Glawischnig-Piesczek, which she requested Facebook to take down. Facebook did not comply with the take-down request, and the complainant brought the dispute before the Austrian Commercial Court, which directed Facebook to remove the specific post as well as (a) identical posts and (b) information ‘having an equivalent meaning as that of the comment’, which would have the effect of violating the complainant’s personality rights.

The Commercial Court’s ruling was ultimately appealed to the Austrian Supreme Court, which referred the following questions of law to the CJEU –

‘(1) Does Article 15(1) of Directive [2000/31] generally preclude any of the obligations listed below of a host provider which has not expeditiously removed illegal information, specifically not just this illegal information within the meaning of Article 14(1)(a) of [that] directive, but also other identically worded items of information:

–       worldwide; 

–       in the relevant Member State;

–       of the relevant user worldwide;

–       of the relevant user in the relevant Member State

(2) In so far as Question 1 is answered in the negative: does this also apply in each case for information with an equivalent meaning?

(3) Does this also apply for information with an equivalent meaning as soon as the operator has become aware of this circumstance?’ 

Article 14 of the e-Commerce Directive, which governs intermediary liability in the EU, exempts a hosting intermediary from liability where it does not have knowledge of the illegal activity or information, or where it acts expeditiously to remove or disable access to that information upon becoming aware of it.

Article 15 of the e-Commerce Directive provides that: ‘Member States shall not impose a general obligation on providers … to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.’ Article 15 thus prohibits Member States from imposing general search and surveillance obligations on intermediaries to filter the content they host. However, as per recital 47 of the directive, this “does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation.”

The CJEU Ruling and Its Global Implications for Free Speech and Privacy

The CJEU ruled that EU law does not preclude –

  • ‘a host provider such as Facebook from being ordered to remove identical and, in certain circumstances, equivalent comments previously declared to be illegal.’ This is the endorsement of the notice-and-stay-down requirement. It implies that an injunction may be directed not only against a specific piece of content, but against any reproductions of that content, without the intermediary gaining specific knowledge (for example, through a distinct notice procedure) of that content. However, the CJEU goes even further and requires that the host provider must also remove equivalent content. The court reasons that the ‘illegality of the content of information does not in itself stem from the use of certain terms combined in a certain way, but from the fact that the message conveyed by that content’ was held to be illegal, and that for an injunction to be effective, it must also be directed towards “information, the content of which, whilst essentially conveying the same message, is worded slightly differently, because of the words used or their combination, compared with the information whose content was declared to be illegal.”
  • ‘an injunction from producing effects worldwide, within the framework of the relevant international law which it is for Member States to take into account.’ There is sparse rationale provided for this, particularly given the more conservative approach adopted by the court in its recent ruling on the Right to Be Forgotten/de-listing obligation under the GDPR. The Court does not foreclose the possibility of global takedowns, but nor does it seem to offer any guidance to Member States on the scope of such global takedowns, on what ‘relevant international law’ must be taken into account here, or on what role the e-Commerce Directive has in determining such scope.

The present ruling stems from increasing tensions globally concerning the inaction of social media companies in the face of illegal online information. The most remarkable aspect of this case is the move away from requiring specificity in content takedowns (for example, through a notice-and-takedown provision which details the particular material that is illegal): the law in the EU now allows general takedowns of both ‘identical’ and ‘equivalent’ information, thus requiring intermediaries and platforms (instead of courts and lawmakers) to make broad determinations of the legality of content, usually without adequate context and certainly without the political authority to do so.

While the ruling marks a clear acknowledgement that the EU is willing to subject intermediaries to more stringent rules in order to preserve individual rights, it offers little by way of balancing considerations of freedom of expression. In particular, the ‘equivalency’ requirement is incredibly broad and paves the way for over-filtering of legitimate content. Further, in ruling that the e-Commerce Directive does not preclude extra-territorial or global effects, without examining the precise scope of these effects, it also undermines judicial comity and international human rights standards on freedom of expression and privacy (outside of the EU).

Of particular concern is the court’s nonchalant endorsement of automated filtering requirements – in effect, the only manner in which the objectives of this ruling can be enforced. In Paragraph 46, the Court notes – ‘(protection against defamation) … is not provided by means of an excessive obligation being imposed on the host provider, in so far as the monitoring of and search for information which it requires are limited to information containing the elements specified in the injunction, and its defamatory content of an equivalent nature does not require the host provider to carry out an independent assessment, since the latter has recourse to automated search tools and technologies.’
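To make concrete why that assumption does less work than the Court suggests, the following Python sketch (with hypothetical post texts, helper functions and a threshold of our own choosing, not anything drawn from the judgment or from any platform’s actual systems) contrasts the two tasks the injunction bundles together: spotting ‘identical’ reposts, which really is a mechanical comparison, and spotting ‘equivalent’ wording, which is approximated here by a crude similarity score whose threshold is exactly the kind of judgement the ruling says no one at the host provider needs to make.

# Minimal illustrative sketch only: the enjoined text, the candidate posts and the
# 0.8 threshold below are hypothetical, chosen to show what 'automated search tools
# and technologies' can and cannot decide on their own.
import hashlib
from difflib import SequenceMatcher

ENJOINED_TEXT = "example comment previously declared to be illegal"  # hypothetical

def normalise(text: str) -> str:
    # Lower-case and collapse whitespace so trivial edits do not defeat matching.
    return " ".join(text.lower().split())

def is_identical(post: str) -> bool:
    # 'Identical' reposts: comparing hashes of normalised text needs no human judgement.
    digest = lambda t: hashlib.sha256(normalise(t).encode()).hexdigest()
    return digest(post) == digest(ENJOINED_TEXT)

def is_equivalent(post: str, threshold: float = 0.8) -> bool:
    # 'Equivalent meaning', approximated here by string similarity. The threshold
    # (0.8 is arbitrary) decides how much rewording still counts as 'the same message',
    # which is itself the assessment the Court says the provider need not carry out.
    ratio = SequenceMatcher(None, normalise(post), normalise(ENJOINED_TEXT)).ratio()
    return ratio >= threshold

for candidate in (
    "Example   comment previously declared to be illegal",            # identical once normalised
    "an example comment that was previously declared to be illegal",  # reworded
    "a report quoting the unlawful comment in order to criticise it", # lawful counter-speech
):
    print(is_identical(candidate), is_equivalent(candidate), "|", candidate)

Even this toy version makes the problem visible: the hash comparison is mechanical, but any ‘equivalence’ test embeds an editorial choice about how much rewording, and how much context, still conveys the same defamatory message, and lawful quotation or criticism of the original post can easily fall on the wrong side of that line.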

Even as the limitations of automated technologies and ‘upload filters’ have repeatedly been brought to the notice of lawmakers and judges, particularly in recent debates concerning the updated Copyright Directive, there has been little effort to grapple with the legal consequences of these technologies, which are viewed as straightforward panaceas for complex determinations of law and policy. A more detailed analysis of the effects of this ruling on the freedom of expression and privacy is available here.
