Fake news in US election becomes bad press for Facebook

World View: Social media giant faces questions over accountability as editor of feeds

Who do you blame? FBI director James Comey, the US electoral college system, racism, misogyny, Clinton complacency or lack of empathy, Donald Trump's charm... All of the above? And how about adding Facebook to the list?

Did the extraordinary online lies and fabrications that Facebook circulated unfiltered contribute significantly to the outcome of the US presidential election? After all, some 44 per cent of US adults, and 61 per cent of millennials, get news on the site.

A BuzzFeed News report found that three large left-wing pages on Facebook published false or misleading information in nearly 20 per cent of their posts, while three large right-wing pages did so in 38 per cent of theirs.

“The best way to attract and grow an audience for political content on the world’s biggest social network,” the report observed, “is to eschew factual reporting and instead play to partisan biases using false or misleading information that simply tells people what they want to hear.” Social media trades on intensifying prejudices.

BuzzFeed also reports that a group of Facebook workers are meeting secretly to co-ordinate a challenge to the company's benign attitude to fake news.

Not guilty, says Facebook founder Mark Zuckerberg, now at the centre of a growing controversy about whether the social media site has an obligation to police its output: "The idea that fake news on Facebook ... influenced the election in any way is a pretty crazy idea."

The same Zuckerberg who boasted that his site had persuaded some two million people to vote who would not otherwise have done so. No influence at all? Yet he has begun to acknowledge that Facebook will have to do something about the problem, albeit while insisting that 99 per cent “of what people see is authentic”. His solution: a version of Wikipedia-like vetting for online news – “enabling our community to flag hoaxes and fake news”.

Advertising curb

Google and Facebook have also announced that they intend to curb advertising on fake news sites, although neither has said how.

The important subtext to the argument is Facebook’s continued insistence that it is not a media organisation but merely a technological platform that others use, and that it has no real moral or legal responsibility for the content its users post. Although it may intervene to take down some of the worst violations of “community standards”, it says, that should not be seen as an admission of an editorial function or of censorship. (It recently “mistakenly” removed the Pulitzer Prize-winning photograph of a nine-year-old girl fleeing napalm bombs during the Vietnam war for violating its nudity standards – a “system error”.)

Yet Facebook’s algorithms, which push news stories to those who have expressed particular interests in the past, are editors. Just because an algorithm, not a human being, makes the selection does not mean that an editing/selection process is not under way. Or, as Guardian columnist Hannah Jane Parkinson puts it: “All journalism, if it’s a decision of what to publish and not to publish, of what stories are worth pursuing and which aren’t, is, if you want to call it such, censorship.”

Meanwhile, German chancellor Angela Merkel has joined in, taking issue with the so-called neutrality of algorithms by calling on major platforms to divulge their formulae, arguing that their lack of transparency endangers debating culture. She told a media conference last week: "I'm of the opinion that algorithms must be made more transparent, so that one can inform oneself as an interested citizen about questions like: 'What influences my behaviour on the internet and that of others?'" Criticising the self-reinforcing echo chamber effect of Facebook algorithms, she warned that a healthy democracy was dependent on people being confronted by opposing ideas.

Emily Bell of Columbia’s digital journalism centre argues that Zuckerberg must intervene more. “By acknowledging that Facebook can and should play a more active part in editing – yes, editing – its own platform, and hiring actual people to do so, Zuckerberg will further the civic commons as well as address a growing problem of how people perceive Facebook.”

Code of conduct

He will also end up acknowledging that Facebook is a media organisation, with all the social responsibilities that should go with that. How about, for example, accepting the disciplines of the Press Council and its code of conduct?

In Europe the courts (in the Delfi case before the European Court of Human Rights) have sought, unconvincingly, to distinguish the legal liability of newspaper sites from that of social media sites for material published on them, on the basis that the former supposedly have a commercial interest in the published content while the latter do not. The court found that a newspaper, but not a social media site, was liable for defamation posted on its site.

“But if it brings in revenues from advertising, then doesn’t that make it a media company too?” observes one leading technology writer, who notes that two-thirds of Facebook users treat it as a source of news. Facebook is bleeding the newspaper advertising market dry; why should it also enjoy legal immunities?