Host providers with actual knowledge of illegal activities must expeditiously (and worldwide) remove or disable access to the information, the ECJ held

 

On October 3, 2019, in Case C-18/18, Eva Glawischnig-Piesczek v. Facebook Ireland Limited, the European Court of Justice (ECJ) held that, under Directive 2000/31 (the Directive on electronic commerce), a platform qualifies as a hosting provider (and so benefits from the liability exemption) only if it plays a passive role, having no knowledge of the content it stores; it must nevertheless expeditiously remove or disable access to illegal information when so requested, and a court may order it to do so worldwide. The ECJ also held that EU Member States may not impose a general monitoring obligation on platforms. In particular, the ECJ found that:

In order to benefit from a limitation of liability, the provider of an information society service, consisting of the storage of information, upon obtaining actual knowledge or awareness of illegal activities has to act expeditiously to remove or to disable access to the information concerned; the removal or disabling of access has to be undertaken in the observance of the principle of freedom of expression and of procedures established for this purpose at national level; this Directive does not affect Member States’ possibility of establishing specific requirements which must be fulfilled expeditiously prior to the removal or disabling of information. Directive 2000/31, Recital 46.

Member States are prevented from imposing a monitoring obligation on service providers only with respect to obligations of a general nature; this does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation. Directive 2000/31, Recital 47.

The ECJ specified, however, that the Directive “should not apply to services supplied by service providers established in a third country.” Directive 2000/31, Recital 58.

Specifically, the ECJ was asked to determine the material and personal scope of a monitoring obligation which may be imposed, in the context of an injunction, on the provider of an information society service consisting in storing information provided by a recipient of that service (a host provider) under Directive 2000/31, the Directive on electronic commerce. The ECJ also had to address the question of the territorial scope of a removal obligation. The ECJ held that a court of a Member State has the power to order the host provider to remove:

1) “information which it stores, the content of which is identical to the content of information which was previously declared to be unlawful, or to block access to that information”;

2) “information which it stores, the content of which is equivalent to the content of information which was previously declared to be unlawful, or to block access to that information, provided that the monitoring of and search for the information concerned by such an injunction are limited to information conveying a message the content of which remains essentially unchanged compared with the content which gave rise to the finding of illegality and containing the elements specified in the injunction, and provided that the differences in the wording of that equivalent content, compared with the wording characterising the information which was previously declared to be illegal, are not such as to require the host provider to carry out an independent assessment of that content”; and

3) “information covered by the injunction or to block access to that information worldwide within the framework of the relevant international law.”

The decision is available here.

FACTS

The applicant, Ms. Eva Glawischnig-Piesczek, a member of the Austrian Nationalrat (National Council, Austria), asked Facebook Ireland to delete a disparaging comment that a user had posted on Facebook alongside an article whose preview included her photograph. The online content could be consulted by any user of the platform.

As Facebook Ireland did not remove the comment, the applicant brought an action before the Handelsgericht Wien (Commercial Court, Vienna, Austria) and requested that the court issue an injunction ordering Facebook Ireland to cease publication of the photographs and the accompanying message.

The Handelsgericht Wien granted the remedy and ordered Facebook Ireland to disable access to the content in Austria.

On appeal, the Oberlandesgericht Wien (Higher Regional Court, Vienna, Austria) upheld the order made at first instance. The courts of first and second instance based their decisions on Austrian law and held that “the public comment contained statements which were excessively harmful to the applicant’s reputation and gave the impression that she was involved in unlawful conduct, without providing the slightest evidence in that regard. Nor, according to those courts, was it permissible to rely on the right to freedom of expression for statements relating to a politician if there was no connection with a political debate or a debate that was in the public interest.”

At last instance, the Oberster Gerichtshof (Supreme Court, Austria) referred a question to the ECJ for a preliminary ruling on “whether the cease and desist order made against a host provider which operates a social network with a large number of users may also be extended, worldwide, to statements with identical wording and/or having equivalent content of which it is not aware.”

OPINION OF THE ADVOCATE GENERAL

On June 4, 2019, Advocate General Szpunar suggested that the Directive on electronic commerce does not preclude a host provider which operates a social network platform from being ordered, in the context of an injunction, to identify and delete content disseminated by users that is identical or equivalent to information deemed illegal. Nor does the Directive preclude that host provider from being ordered to remove the illegal information worldwide.

According to the Advocate General, the Court had already made clear that the operator of a social network platform which stores information provided by the users of that platform in connection with their profiles is a hosting service provider under Directive 2000/31. Under the Directive, such a host provider enjoys relative immunity from liability for the information it stores, on condition that, once made aware of the illegality of certain content, it acts expeditiously to remove that information or to disable access to it.

Article 15(1) of Directive 2000/31 prohibits Member States from imposing on providers of services whose activity consists in storing information a general obligation to “monitor the information which they store” or “actively to seek facts or circumstances indicating illegal activity.”

However, the immunity granted to an intermediary service provider does not prevent a court from requiring that service provider to terminate or prevent an infringement. Even if not liable itself for the information stored on its servers, the service provider may be the addressee of injunctions.

According to Advocate General Szpunar, “the targeted nature of a monitoring obligation should be envisaged by taking into consideration the duration of that monitoring and the information relating to the nature of the infringements in question, their author and their subject.” Those elements must be assessed to determine whether an injunction complies with the prohibition on imposing general monitoring obligations on service providers. “A permanent monitoring obligation would be difficult to reconcile with the concept of an obligation applicable in a specific case.” However, an obligation does not amount to general monitoring where a specific infringement has actually been identified and the host provider is ordered to prevent any further infringement of the same type and by the same recipient.

In addition, according to the Advocate General, the prohibition on general monitoring obligations is not infringed if the host provider is asked to remove or block access to “statements identical to the statement characterized as illegal that are published by the same user.”

As regards the injunction to remove all statements that have “equivalent content to that characterized as illegal and are published by the same user”, Advocate General Szpunar deems that “a court adjudicating, in the context of an injunction, on the removal of ‘equivalent information’ must thus respect the principle of legal certainty and ensure that the effects of that injunction are clear, precise and foreseeable. In doing so, that court must weigh up the fundamental rights involved and take account of the principle of proportionality.”

Finally, addressing the territorial scope of a removal obligation and whether the host provider may be ordered to remove illegal content not only in the Member State that considers that content illegal but also worldwide, Advocate General Szpunar highlights that Directive 2000/31 does not regulate the territorial scope of a removal obligation imposed on a host provider in the context of an injunction. Therefore, no provision precludes that host provider “from being ordered to remove worldwide information disseminated via a social network platform.” However, “the implementation of a removal obligation should not go beyond what is necessary to achieve the protection of the injured person.”

The full text of the opinion in Case C-18/18, Eva Glawischnig-Piesczek v. Facebook Ireland Limited is available at https://eur-lex.europa.eu….

For more information, contact Francesca Giannoni-Crystal and Federica Romanelli.