
The obligations on social media platforms to remove defamatory content may be increasing

Wednesday, 17 July 2019

68% of Irish businesses have a social media account, and tweets and Facebook posts are published by and about these businesses on a daily basis.

But what obligations do social media platforms have when the material being posted and shared is defamatory? This question was recently considered in an interesting opinion delivered by an Advocate General of the European Court of Justice. If his opinion is followed by the European Court, it could extend the existing legal obligations on these platforms[1].

What is the current legal position?

Social media platforms are, in principle, not liable for online content shared by others unless they are made aware that it is defamatory[2]. This immunity is granted because social media platforms are not deemed to have knowledge of, or control over, the information transmitted on their platforms unless an issue is specifically brought to their attention[3]. This can be contrasted with newspapers and television networks, which are held accountable for everything they publish.

What did the Advocate General’s opinion relate to?

The Advocate General’s opinion related to an injunction obtained in Austria requiring Facebook to remove defamatory content[4]. The issues to be considered by the European Court are whether the injunction can apply to identical and/or equivalent content, and whether it can be extended worldwide.

What did he say?

  1. Can a Court order a social media platform to remove identical information to that which is deemed to be illegal?
    Yes. The Advocate General held that the EU Directive does not prevent a social media platform from being ordered, in injunctive proceedings, to seek and identify, amongst all of its users, information identical to that which has been found to be defamatory by the Court that issued the injunction.

    The Advocate General was satisfied that this approach does not impose an extraordinary burden, and that it strikes a balance between competing fundamental rights: privacy and personality rights on the one hand, and the freedom to conduct business and the freedoms of expression and information on the other.

  2. Does the obligation to remove cover equivalent information?
    Yes, but only where the host provider is aware of it. The Advocate General held that the EU Directive does not prevent a host provider from being ordered to seek and identify equivalent information. This is information that “scarcely diverges from the original information” or where “the message remains essentially unaltered”. Such an obligation is limited to information published by the same user (not all users) who published the defamatory content. The removal obligation for equivalent information only arises when the host provider is made aware of that equivalent information, and because of this, a general monitoring obligation does not arise.
     
  3. Can a removal order extend worldwide or only within the EU?
    Yes, it can extend worldwide. The Advocate General held that the EU Directive does not regulate the territorial scope of an obligation to remove information published on a social media platform, nor is it regulated by EU laws in the context of defamation. An approach of self-limitation was adopted by the Advocate General, and he cautioned that the removal obligation should not go beyond what is necessary to protect the injured party. Depending on the case, the Court may decide to disable content through geo-blocking, rather than require worldwide removal.


What does this mean?

The Advocate General’s opinion is not binding but it is persuasive and will likely carry weight in the European Court’s determination, which is expected in the coming months. Should the European Court follow it, the obligations on social media platforms to locate and remove identical and equivalent content under Court order may be substantially extended. 


For further information or advice, please contact 
Mark O'Shaughnessy or any member of the ByrneWallace Litigation & Dispute Resolution Team.




[1] Eva Glawischnig-Piesczek v Facebook Ireland Limited Case C-18/18, Opinion of Advocate General delivered on 4 June 2019

[2] E-Commerce Directive (2000/31/EC) (the “EU Directive”)

[3] Article 15 provides that “Member States shall not impose a general monitoring obligation on providers…to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.”

[4] See 1 above