Give me a crash course in . . . what’s going on with Facebook

The social network is back in the public spotlight after a Channel 4 investigation


What happened with Facebook this week?

Where do you start? This week saw the airing of an episode of the Channel 4 programme Dispatches that offered a peek inside Facebook’s content review department.

Undercover reporters who attended training sessions at the moderation centre over a six-week period found that graphic videos of assaults on children, images of self-harm and hate speech were being left on the platform, even after being reviewed by moderators.

The footage showed staff at the Dublin centre being instructed during training not to remove extreme, abusive or graphic content from Facebook, even though it violated the community standards laid down by the platform. Among the videos was one of a man kicking and beating a toddler. Another, which was left up, told “Muslim immigrants” to “f*** off back to [their] own country”.

The programme also alleged that moderators were trained to ignore signs that users were under 13, the minimum age Facebook sets for signing up.

The investigation also claimed that certain pages with large followings were given special consideration, with any content on them that potentially violated policies referred to the senior “Cross Check” reviewing team.

What do Facebook moderators do?

When content is flagged, either by Facebook’s automated systems or by Facebook users, as potentially going against the community standards, it is sent to a human moderator for review. These moderators are trained to follow Facebook-defined policies on what to do with the content: ignore it, mark it as disturbing, or remove it from the platform. Content marked as disturbing gets a warning applied to it, so users have to click through to see it.

The video showing the man stamping on a toddler, for example, was marked as “disturbing” but was allowed to remain on the site. The moderators in the Channel 4 footage were working for CPL Resources, which has a content moderation contract with Facebook.

Why are moderators so important?

Facebook has more than 2.2 billion active users every month, each of whom can upload content to the platform. About 100 million hours of video are watched on Facebook every day, and more than 350 million photographs are uploaded to the platform daily. That’s a lot of content to keep an eye on.

Facebook has a number of automated systems in place to flag content such as graphic violence and hate speech, but they won’t pick up everything. Context is important, and while the automated systems perform well at flagging spam accounts, Facebook still refers reports of graphic violent content to a human for review.

When it has been challenged in the past over objectionable content on its platform, Facebook has committed to increasing the number of human moderators it employs. But the Dispatches footage clearly showed the system was falling down somewhere along the way.

What did Facebook say about it?

Facebook said the practices shown in the undercover footage were “mistakes” that do not “reflect Facebook’s policies or values”. It said it had reviewed the issues raised by the programme and fixed the mistakes it found, and that it had required all trainers in Dublin to complete a retraining session and was preparing to do the same globally.

The company denied suggestions that it was in its commercial interests to turn a blind eye to bad content, and rejected claims that extremist pages with large followings were given special consideration, pointing to the removal of Britain First’s page in March this year as an example.

What did CPL Resources say?

After initially refusing to comment on “commercial client relationships”, CPL issued a statement saying it had taken immediate action once it was made aware of the Channel 4 allegations.

What now?

Facebook is facing increasing calls for regulation if it can’t police its own platform adequately. The Children’s Rights Alliance, the Irish Society for the Prevention of Cruelty to Children and CyberSafe Ireland have all claimed self-regulation for online platforms is insufficient, although Taoiseach Leo Varadkar earlier this week expressed confidence that the tech companies were “attuned” to the issue.

Meanwhile, Minister for Communications Denis Naughten has already met the company in New York and expressed his concerns about the incident.
