Facebook loses final appeal in defamation takedown case, must remove same and similar hate posts globally
Austria’s Supreme Court has dismissed Facebook’s appeal in a long-running speech takedown case, ruling that it must remove references to defamatory comments made about a local politician worldwide for as long as the injunction lasts.
We’ve reached out to Facebook for comment on the ruling.
Green Party politician Eva Glawischnig successfully sued the social media giant, seeking removal of defamatory comments made about her by a user of its platform, after Facebook refused to take down abusive postings that referred to her as a “lousy traitor”, a “corrupt tramp” and a member of a “fascist party”.
After a preliminary injunction in 2016, Glawischnig won local removal of the defamatory postings the following year but continued her legal fight, pushing for similar postings to be removed and for takedowns to apply globally.
Questions were referred up to the EU’s Court of Justice. And in a key judgement last year the CJEU decided platforms can be instructed to hunt for and remove illegal speech worldwide without falling foul of European rules that preclude platforms from being saddled with a “general content monitoring obligation”. Today’s Austrian Supreme Court ruling flows naturally from that.
Austrian newspaper Der Standard reports that the court confirmed the injunction applies worldwide, both to identical postings and to postings that carry the same essential meaning as the original defamatory posting.
Per the report, the Austrian court argues that EU Member States and civil courts can require platforms like Facebook to monitor content in “specific cases”, such as when a court has identified user content as unlawful and the platform has been given “specific information” about it, so that content judged to be illegal can be prevented from being reproduced and shared by another user of the network at a later point in time. The overarching aim is to prevent future violations.
The case has important implications for the limits that can be placed on online speech.
Regional lawmakers are also working on updating digital liability regulations. Commission lawmakers have said they want to force platforms to take more responsibility for the content they fence and monetize — fuelled by concerns about the impact of online hate speech, terrorist content and divisive disinformation.
A long-standing EU rule prohibiting Member States from putting a general content monitoring obligation on platforms limits how far platforms can be forced to censor speech. But the CJEU ruling has opened the door to bounded monitoring of speech, in instances where it has been judged to be illegal, and that in turn may influence the policy substance of the Digital Services Act, which the Commission is due to publish in draft early next month.
Reacting to last year’s CJEU ruling, Facebook argued it “opens the door to obligations being imposed on internet companies to proactively monitor content and then interpret if it is ‘equivalent’ to content that has been found to be illegal”.
“In order to get this right national courts will have to set out very clear definitions on what ‘identical’ and ‘equivalent’ means in practice. We hope the courts take a proportionate and measured approach, to avoid having a chilling effect on freedom of expression,” it added.