Facebook assumes role of super-editor with ‘Napalm girl’ deletions

Treatment of Vietnam War photograph highlights social network’s censorship issues

Facebook made multiple deletions of Nick Ut’s photograph of nine-year-old Kim Phuc, the naked “napalm girl” pictured fleeing an attack in 1972 near Trang Bang in South Vietnam. Photograph: Nick Ut/AP Photo

Dear Mark Zuckerberg. Only joking. Tempting though it often seems, I have no plans to write an open letter to Mark Zuckerberg.

One person who wrote one is Espen Egil Hansen, editor-in-chief of Norwegian newspaper Aftenposten, and there was one line in it that couldn’t have loomed any larger had it popped up as a full-screen, all-caps push notification: “Editors cannot live with you, Mark, as a master editor.”

The single sentence pithily encapsulates important points about censorship, algorithms and all-powerful platforms, and gifts us the surely catchy moniker “master editor”. But as a reflection of how most media organisations engage with Facebook today, it works less well.

Individual editors may not be able to live with Zuckerberg as a master editor, but collectively they do very much seem to be trying their best to move into his house. They’re knocking on his door and shouting through the letterbox: “Why have you changed the locks again, Mark?”

And when Zuckerberg’s robot butler, the one he’s building to look after his child, opens the hatch and reveals the rich digital paradise that gleams inside, they’re plunging head-first into the hallway, plonking their tiny foldaway beds in the massive kitchen and saying: “We swear, Mark, you won’t even know we’re here.”

‘Napalm girl’

The context to the unexpected chilling of relations between Norway and Facebook, if you missed it, is the Vietnam War. Hansen wrote his open letter because he was among the many who were perturbed by Facebook’s multiple deletions of Nick Ut’s famous photograph of nine-year-old Kim Phuc, the naked “napalm girl” pictured fleeing an attack in terror and pain.

The original censorship applied to Norwegian author Tom Egeland, who had posted about seven photographs that changed the history of warfare. Facebook removed the photograph of Kim Phuc on the grounds that her nakedness did not comply with its “community standards”.

When Egeland complained, his account was suspended. Phuc said she was saddened that the focus in Ut’s “moment of truth” was now on nudity, “rather than the powerful message it conveys”, while Aftenposten also had a discussion post deleted from its Facebook page. When Norwegian prime minister Erna Solberg and half her cabinet shared the image, those posts, too, were removed.

Eventually, Facebook woke up and backed down, reinstating the posts due to the image’s “historical importance”. But one of the properly worrying dimensions to the incident is why it took Facebook so long to change tack.

Accusations earlier this year that it displayed an anti-conservative bias in its “trending topics” feature may have contributed to its self-defeating reluctance on this occasion to use human eyes – and good corporate sense – to distinguish between “napalm girl” and child pornography. Facebook made the trending topics dispute go away by firing the team who selected stories for it, and the tool is now controlled by an algorithm with a penchant for hoaxes and conspiracies.

Algorithms are proffered by Facebook as evidence that it is not an editor and definitely does not take editorial decisions that can then be criticised. It denies having the power that Hansen and others say it has. If the censorship of “napalm girl” achieves anything, it will be to provide a solid case study to those who believe Facebook should acknowledge that it acts precisely like an editor, and a super-editor at that.

News feed

Coincidentally, Facebook was last week celebrating the 10th anniversary of its news feed, the feature that makes Facebook what it is today – an updating scroll of posts from friends, friends of friends, pages we liked, pages our friends liked and advertisers (in about that order of interest).

The news feed is also governed by algorithm, of course, and just one of the ways it works is to give greater prominence to the type of posts we “like”. The Onion’s headline “Horrible Facebook Algorithm Accident Results in Exposure to New Ideas” hints at why this might pose an issue for serious news media trying to distribute stories on Facebook, which prioritises posts from friends and family anyway.

For sure, it is difficult to dispense with Facebook. On a wet Tuesday night, I’ll cheerfully live vicariously through 88 holiday pictures posted by my happiest friend. It makes a nice change from crying into my carbonara about Facebook’s surging share of advertising revenue, what this ultimately means for jobs in journalism, and the idea that it is master editor of our lives because we have let it be.

One traditional power of the editor – placement – certainly does not translate very well. The social network not only decides whether stories comply with catch-all “community standards” and are allowed to appear; its algorithm also decides the order and context in which they surface.

Hansen wants Zuckerberg to be more accessible, less authoritarian and more discerning. Who could disagree? But another of his conclusions, that Facebook “should distinguish between editors and other Facebook users”, reads like wishful thinking, and a little bit, well, like an old-media thing to say.