Handling of Kriégel case by social media giants raises questions

Response demonstrates self-regulation versus legal-regulation dynamic at work

The dramatic injunction against Facebook and Twitter granted last Wednesday was rapidly amended by the court on Thursday. Mr Justice Michael White removed from the injunction the part that constituted a prior-restraint order requiring the social media giants to prevent, in advance of posting, identification of the two boys convicted of murdering Ana Kriégel. The part of the order requiring those companies to remove such postings once they became aware of them remained in place. And in that nuanced amendment lies the nub of an argument taking place worldwide about the regulation of social media.

Mr Justice White wisely sidestepped making any ruling on whether Facebook and Twitter should be regarded as “publishers”, in the traditional sense of being editorially responsible for content posted by users on their online platforms. If he had engaged in that deliberation, any judgment he gave would most certainly have ended up in the Court of Appeal and most likely in the Supreme Court also. He – quite rightly in my opinion – left that argument to take place in another court on another day.


The concept of treating Facebook and Twitter as “publishers” who can be held liable for the content that appears on their platforms is one resisted vigorously by those companies at every opportunity. Facebook chief executive Mark Zuckerberg has repeatedly referred to Facebook as “a tech company” and not a “media company” – an argument that seems somewhat disingenuous given that Facebook is the most widely used social media source of news in Europe.

The concept of ‘pre-moderating’ content and preventing its upload to social media platforms is anathema to the ethos – not to mention the lucrative business model – of social media companies. Calls to require social media companies to take steps to prevent certain content – such as child pornography – from appearing on their services meet no significant resistance. There are, however, genuine implications for freedom of speech and so-called "internet freedom" rights if laws are passed holding online platforms absolutely responsible for the content posted on those platforms.

In January 2012, Wikipedia, along with other internet-based services, shut down for a day in protest against proposed US laws aimed at protecting intellectual property online. The proposed laws included measures cutting infringing websites off from web-based payment processors and obscuring their visibility in search results. A similar one-day shutdown accompanied the passing earlier this year of the new European Union Copyright Directive, which imposes on internet platforms far higher obligations than previously applied to prevent and manage copyright infringement on their services.


The best analogy for examining and evaluating what level of responsibility should be imposed on social media companies for the content published on their platforms probably lies in an otherwise obscure defamation case brought by a disgruntled Mr Byrne against the owners of his golf club more than 80 years ago.

A notice containing “defamatory doggerel” about Mr Byrne had been pinned anonymously on to the clubhouse noticeboard. Mr Byrne sued the proprietors of the golf club, who were held liable for defaming him. The court reasoned that, while the proprietors were correct in saying that an anonymous individual was the original publisher of the notice, their failure to remove it after they became aware of it rendered them liable for its contents.

The otherwise long-forgotten case of Byrne v Deane has been referenced directly in more than one high-profile case in the UK where judges have wrangled with this problem of whether a social media company should be regarded as "a publisher" for the purposes of a defamation claim. In 2011 (Davison v Habeeb and Ors), Google Inc's Blogger.com service was aptly described in the English High Court as "analogous to a gigantic noticeboard".

Moderation criteria

Let us imagine that before any posting we made appeared on our Facebook timeline or Twitter account, it was subjected to scrutiny by some form of “net-nanny” algorithm, or by an unknown employee or contractor in one of those companies, to evaluate whether it was suitable for posting. So long as that net nanny was applying moderation criteria to which our society generally subscribes – blocking, for example, videos of extremist violence or child pornography – we probably wouldn’t take issue with it.

But what if commercial or operational considerations started to influence a tech company's pre-posting moderation criteria? Considerations such as – to speculate only – whether the Chinese government would be willing to let Facebook provide its services in China subject to certain restrictions, or whether Google is willing to share its algorithm technology with the US military. Were such factors to start influencing a tech giant's decisions about what we could and couldn't post on its platforms, we would justifiably have concerns over shadowy restrictions being imposed on our rights to freedom of expression.

In court on Thursday, Facebook went to great lengths to outline the measures it had taken to be ready to address potential breaches on its platform of the Kriégel trial judge’s order that the two accused, now convicted, teenagers’ identities must not be published. The fact that Facebook went to these lengths, despite maintaining that it is not a publisher, demonstrates the self-regulation versus legal-regulation dynamic at work here.

It’s in the commercial interests of Facebook and Twitter, and other tech companies providing gigantic “noticeboards” on which any of us can post content, to show they are constantly striving to stay ahead of the curve; to show they’re managing potential abuse of their services. This is the only way for them to avoid the intensification of calls for legislation to regulate their activities. It’s up to the social-media companies to prove they can do this quickly and effectively. There are major and legitimate question marks over whether they did so in the Kriégel case.

Andrea Martin is a partner at MediaLawyer Solicitors