Will Facebook ‘shame’ page case lead to slew of similar lawsuits?
Social media giant named in Belfast case involving 14-year-old girl
A legal case involving a 14-year-old girl and a “shame” page on Facebook has potentially cast the social media giant in a leading role for future legal challenges. The case was filed after a naked photograph of the teenager was posted several times on the Facebook page.
Hailed as a landmark case, it explicitly named the firm as partly responsible alongside the individual who posted the photo. The exact details of the settlement are confidential, although the court was told Facebook is paying the teenager’s legal costs.
So is Facebook facing a slew of cases as a result of this action? There may well be further cases filed. But there are some things to bear in mind.
As this case was settled out of court, there is no judgment to pore over, and no legal obligation on Facebook as a result – at least, none that was announced publicly.
This particular case dealt with a photograph that was posted multiple times between 2014 and 2016. Last year, Facebook introduced a tool that would prevent the sharing of “revenge porn” on the site. An image, once reported as revenge porn, is reviewed by a specialist team, is “hashed” so it can be identified, and cannot be uploaded again to the site, or indeed shared through Messenger or Instagram.
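The report-review-hash-block workflow described above can be sketched in a few lines of code. This is a simplified, hypothetical illustration: Facebook’s actual system uses perceptual image matching rather than the plain cryptographic hash used here, and the function names are invented for the example.

```python
import hashlib

# Hashes of images that reviewers have confirmed as revenge porn.
blocked_hashes = set()

def hash_image(image_bytes: bytes) -> str:
    """Return a fingerprint for the image content.

    (Illustrative only: a real system would use a perceptual hash
    so that resized or re-compressed copies still match.)
    """
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes: bytes) -> None:
    """Called after a human reviewer upholds a report:
    the image's hash is added to the blocklist."""
    blocked_hashes.add(hash_image(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """Reject any new upload whose hash matches a blocked image."""
    return hash_image(image_bytes) not in blocked_hashes

# A reported photo can no longer be uploaded; other photos are unaffected.
report_image(b"reported-photo-bytes")
print(allow_upload(b"reported-photo-bytes"))  # False: blocked
print(allow_upload(b"some-other-photo"))      # True: allowed
```

The key design point is that the service never needs to keep the offending image itself once it is hashed: comparing fingerprints at upload time is enough to stop exact copies from reappearing.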
For the most part, objectionable content on Facebook is dealt with in a reactive manner, so you have to report an image or video to kick off the process. Facebook does not scan every piece of content that is uploaded to its platform. The workload alone would be massive, as millions of photographs are uploaded each day, and artificial intelligence, while able to relieve some of the burden, is not yet at a stage where it can replace human intervention. Each of the millions of reports Facebook receives is reviewed by a human team.
And, save for cases like this – or worse – would people really be happy with the company doing so? Scanning every public post, maybe; but how comfortable would you feel about every message, every private post being reviewed by human or machine, all in the name of hunting down banned content? While we know, deep down, that we and our data are a product to these services, there is a difference between knowing that our online activities are being noted and having that point rammed home on a regular basis.
None of this is to say that Facebook should be absolved of all responsibility. But this case, while notable, may not bring about the sweeping changes that people hope for.
There are certain things we do know, although they are more common sense and good business practice than anything to do with this particular case. Facebook has a responsibility to users – and non-users – of its network who find their images are being misused on the platform. It has a duty to act when questionable content is posted, to protect those users.
However, the nature of social media – instant sharing, global reach, massive user base – makes it a particularly effective tool for anyone who wants to spread such material. Add to that a report published yesterday that said 13 per cent of secondary school students have sent a nude or semi-nude photo or video of themselves, and 15 per cent have shared or shown a friend a photo sent by someone else, and it makes for sobering reading.