‘Inside Facebook’: Dublin’s disturbing role in turning blind eye to hate speech

Review: ‘Secrets of the Social Network’ undercover investigation exposes Facebook’s Dublin content review unit


According to its founder, Mark Zuckerberg, Facebook has one soaring mission: to connect the entire world.

Its business model, however, is a little cruder: connecting the entire world to advertisers. This it achieves with a broadly libertarian, only fitfully regulated policy towards the content this world chooses to share, cracking down on images of breasts, say, while leaving hate speech largely unchecked. Even to the casual observer, that has long seemed less like coherent terms of service than the peculiarities of a misfiring algorithm, which is also a polite way of describing Mark Zuckerberg.

In truth, though, Facebook employs people to review whatever content its users report as unacceptable, and provides them with a policy to help decide whether to ignore it, delete it, or mark it as disturbing – thereby preceding it with a warning. That last option is abbreviated as MAD, which makes it all the more uncanny to hear a distorted but unmistakable Dublin voice refer to “the MAD stuff” – a category that may include a video of a man eating live baby rats, or another man violently kicking a two-year-old boy.

Why is such material not automatically removed? he is asked in Inside Facebook: Secrets of the Social Network (Channel 4, Tuesday, 9pm), an undercover exposé of Facebook’s content review department, based in Dublin. “For better user experience,” he responds in numbed tech jargon. Forget the Nuremberg defence. They are only following policy.

With secret cameras deployed in a corporate training module, Facebook’s content scrubbers split hairs over whether material might be considered offensive, using fascinatingly flimsy rationales. A video of a brutal beating, for instance, may be ignored if it has a caption condemning the beating – because how could anyone extract pleasure from something that has been publicly deplored?

“What Facebook has learned is that the people on the extremes are the really valuable ones,” says Roger McNamee, an early investor turned sceptic. Disturbing content is more likely to be shared, whether in support or in dismay, thus reaching more eyeballs, which in turn encounter more advertising.

Richard Allan, Facebook’s VP of global policy, protests that extreme content “is not the experience we’re trying to deliver”, and nobody would doubt him, but the documentary illustrates the grave consequences of Facebook’s own radically desensitised approach.

That video of the physically abused two-year-old, for instance, is still online six years later, under the wan defence of “spreading awareness”. (A psychologist roundly dismisses the same argument when it is used to leave images of self-harm online.) The NSPCC’s response is unanswerable: “That child is being reabused with every click.”

If Facebook is serious about the “experience”, it seems strange to grant “shielded” status to extremely popular users, such as the notoriously racist Britain First, which had close to a million fans before being removed from the platform. “Obviously they had a lot of followers, so they generated a lot of revenue for Facebook,” reasons that Dublin voice again.

Facebook ought to mark those words, and this disquieting programme, as disturbing – the mad stuff born not of some weakly defined concept of freedom of speech, but of a gutless corporate culture that turns a blind eye to the darker parts of the world it connects.

Dislike.