Era of Facebook self-regulation ‘is over’, says committee chair

Social media firm apologises amid calls for State to take lead on regulation

Facebook head of public policy Niamh Sweeney and Siobhán Cummiskey, head of content policy for Europe, the Middle East and Africa, arrive at the Dáil on Wednesday. Photograph: Nick Bradshaw

The era of self-regulation by social media platforms is over and the Republic must lead the way in deciding how they are regulated in the future, an Oireachtas committee has said.

The Joint Oireachtas Committee on Communications questioned representatives of Facebook on Wednesday in the wake of an undercover television investigation which appeared to show moderators working for a contractor employed by the company deciding not to remove extreme and controversial content.

This included a video of a child being physically abused, first posted six years ago, and a post comparing Muslims to “sponges”.

Facebook representatives told the committee the Channel 4 Dispatches programme aired on July 17th had identified “some areas where we have failed” and it apologised for those failings.

The social media giant said it would “substantially increase the level of oversight” of its training of content moderators. It was carrying out an internal investigation with its Dublin-based contractor, CPL, to establish how the “gaps between our policies and values and the training given by CPL staff came about”.

Niamh Sweeney, head of public policy for Facebook Ireland, said the safety and security of users was a “top priority” for the company and it had created policies, tools and a reporting infrastructure that were designed to protect all its users, especially those who were most vulnerable to attacks online.

While Facebook insisted it was not throwing the contractor “under a bus”, it also said the Facebook slide presentations delivered to content moderators and seen in the undercover filming had been altered by CPL. The social media company said, however, it accepted “full responsibility”.

‘Incapable or unwilling’

Committee member and Sinn Féin TD Brian Stanley said the episode suggested Facebook was “either incapable or unwilling” to deal with the issue of regulation and to stop this kind of content appearing on the platform.

Fianna Fáil TD Timmy Dooley said the disturbing content on the platform was “effectively the cocaine” which attracted users to it and kept them sharing that content.

Fine Gael TD Hildegarde Naughton, chair of the committee, said after the hearing it was the view of the committee that “the time for apologies and remedial action was past”.

“Social media platforms have shown themselves incapable of self-regulation. If they won’t regulate themselves, we must do it for them,” she said.

Ms Naughton said the committee would now prioritise scrutiny of a Digital Safety Commissioner Bill proposed by Deputy Donnchadh Ó Laoghaire of Sinn Féin.

She would write to Taoiseach Leo Varadkar and the Minister for Communications Denis Naughten “indicating the need, in principle, for such legislation”, notwithstanding whatever changes might be warranted after detailed scrutiny by the committee.

Ms Naughton will also seek meetings with the president of the European Parliament, Antonio Tajani, and with the EU commissioner for digital economy and society, Mariya Ivanova Gabriel, to discuss what she said was the need for European regulation.

“As Facebook’s European headquarters is based in Ireland, we must lead the way. The press are regulated. Television and radio are regulated. It is the view of the committee that the time for self-regulation by social media platforms is over,” Ms Naughton said.

Ms Naughton said the committee was “absolutely certain of the need to ensure that images of child abuse and other such illegal/graphic activity can no longer be shared on Facebook or other social media platforms”.

“No organisation or platform should be above the law. No organisation or platform has the right to decide how it will conduct itself, no matter the adverse consequences for its users.”