The Irish Times view on Facebook’s abusive content: clicks and money trump decency
Children need greater safeguards but Government failed to inspire when it announced its Action Plan for Online Safety
Money matters most for Facebook, even if the route to it means turning a blind eye to disturbing, abusive and violent content on its platform. As a shocking Channel 4 undercover investigation graphically showed, the platform not only tolerates but sometimes shields harrowing content. A reporter, posing as a trainee in a Facebook content moderation job, was instructed to allow videos portraying violent child abuse and fights between teenagers, racist posts, animal abuse and images of self-harm. He was also cautioned to ignore content from extremist organisations and individuals.
The reason for such incomprehensibly feeble moderation policies? Trainers and a Facebook executive made some mention of not wanting the site to be overly censorious, or to limit freedom of speech. But the sad truth, as the programme also made clear, is that extreme content encourages more clicks, more sharing of posts, more viewing of videos – and thus more income.
Facebook has insisted that many of the items shown violated its standards and should have been removed. It also said no special status was accorded to extremist organisations. Yet this rings hollow. Training materials used by Dublin-based CPL, Facebook’s largest centre for UK content moderation, are surely prepared in co-ordination with the social network. They indicated a broad tolerance for posts, images and videos that sits uneasily with Facebook chief executive Mark Zuckerberg’s regular insistence that the platform is a safe community for its users.
If Facebook isn’t inclined to adequately moderate the huge platform and its more than two billion users, what can be done? Finding solutions will be challenging. New approaches to regulation – such as deeming the platform a media company, with the additional legal responsibilities that would bring – and better laws at national and international level are one place to start. The EU has already indicated it may use antitrust legislation to limit Facebook’s activities and its size. Europe is also considering legislation requiring greater transparency of algorithms, the highly secret mathematical engines underlying sites like Facebook.
Individuals also have privacy protections, strengthened by the new General Data Protection Regulation, that give them greater control over how their data is used. The proposed EU ePrivacy regulation, still in draft form, should bring further protections.
But children, in particular, need greater safeguards. Disappointingly, the Government failed to inspire when it announced its Action Plan for Online Safety recently, postponing the enactment of some key promises, such as creating a digital safety commissioner. However, internet activists have noted the State could add social media companies to the list of entities which must report child abuse under the Children First Act 2015. While there are few quick fixes, this would at least be an immediate win for children.