How fake news is destroying transparency on the internet

The spread of propaganda and hashtag flooding have damaged utopian ideas about the public benefit of the web


Many of the early, optimistic assumptions about how the internet would create a public sphere with greater openness, transparency and accuracy have been battered by how it has actually been used and abused, according to Frank Pasquale, professor of law at the University of Maryland.

During a talk, The Automated Public Sphere, last week at Berlin's digital culture festival Re:publica, Pasquale said that fake news stories, the spread of propaganda, secret sponsors behind what we see and read, and hashtag flooding (using hashtags to flood searches on a topic) had all damaged utopian ideas about the public benefits of the internet.

“We were told the internet would empower everyone and reduce the dominance of mainstream media, but it has also encouraged extremism,” he says. “It promised openness, but lets influence go unchecked and unmonitored” because it is difficult to figure out who is actually funding and supporting many websites.

He also notes that academic researchers have established that tens of thousands of posts to social-media sites during the American presidential election came from automated bots. Hundreds of fake stories were shared.


The use of so-called "dark ads" (promoted but unpublished posts, visible only to followers of a Facebook page) and ad personalisation allowed certain types of content to be targeted directly at those "who are most susceptible to it", because such content and news stories appeared only in an individual's personal news feed – where a wider community could neither see the posts nor refute them.

“So, you can accelerate propaganda, as well as accelerate truth,” he says.

Ridiculous to most

While some fake news stories rightly appear ridiculous to most – such as the "pizzagate" story that claimed Hillary Clinton and others were secretly running a child-abuse ring in the back of a popular Washington DC pizza restaurant – there are susceptible audiences.

“We have to worry a great deal about floating voters, low education voters, voters at the edge of the political process being susceptible,” he says. Sometimes the goal is simply to create doubt, so that some voters never vote at all.

On Facebook, “everything looks the same, and appears with the same level of authority” on a newsfeed, but this provides a “debased egalitarianism”. And there’s little incentive for tech giants to rush to address the problem. Profits are linked to the proliferation of fake news: “Sensational lies and outrage cycles promote user engagement.”

Rhetoric in the US – again, largely from the tech industry – about the need to have one “unbroken”, unified internet deters addressing the problem. The common assertion is that introducing regulation and oversight means “you will break the internet”, Pasquale says.

And deregulation "is very disingenuous because deregulation is a lie. You essentially cede power to massive corporations to be de facto regulators. Facebook and Google are effectively the regulators, but acting in ways without public accountability."

He is also “very worried that the US-centric view of the world is overinfluencing international bodies, when culturally specific models are needed.”

What could be done to improve the situation?

Labelling, monitoring and explaining hate-driven, biased search results is Pasquale’s first suggestion. He points to the rise of the alt-right and Holocaust revisionism, and to the expert gaming of search results by the extreme right, so that a Google search on topics such as the Holocaust returns false stories as top results.

Autocomplete bias, where a partially typed search query returns hate-driven suggestions for completing the search phrase, is also an issue. “Whole racial groups have hate-filled autocompletes,” he said.

White supremacy

After Charleston church shooter Dylann Roof had googled subjects such as "black on white crime" and seen racist autocompletes and fake news that suggested "white genocide" was a possibility, he murdered nine people in a historically black church. And the fake pizzagate story reportedly originated in a white supremacist's tweet.

To address the problem, automatic logs could reveal where such information originates; that could also be crucial for rapid responses and take-downs by tech companies – before a gunman travels to a location, as happened after the pizzagate story proliferated.

“We should require immutable audit logs,” Pasquale says. Silicon Valley experts say the internet is now so complicated that laws can never catch up, “but I think it’s very important . . . that we push back immediately. We cannot adopt this condescending mode that the coding sphere is too complicated for the rest of us to understand. We can at least have logs of the data that are influencing certain results on Google and could help identify certain sources of information.”
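Pasquale does not describe a particular design, but a minimal sketch of what an "immutable audit log" could look like is an append-only, hash-chained record: each entry commits to the hash of the one before it, so any retroactive edit breaks the chain and is detectable. The AuditLog class and the fields in the sample records below are hypothetical, for illustration only.

```python
import hashlib
import json
import time


class AuditLog:
    """Append-only log in which each entry embeds the hash of the previous
    entry, so any later alteration breaks the chain and can be detected."""

    GENESIS = "0" * 64  # placeholder hash preceding the first entry

    def __init__(self):
        self.entries = []

    def append(self, record: dict) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else self.GENESIS
        body = {"timestamp": time.time(), "record": record, "prev_hash": prev_hash}
        # Hash a canonical (key-sorted) serialisation of the entry body.
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        entry = dict(body, hash=digest)
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute every hash in order; False means the log was altered."""
        prev_hash = self.GENESIS
        for entry in self.entries:
            body = {k: entry[k] for k in ("timestamp", "record", "prev_hash")}
            digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if entry["prev_hash"] != prev_hash or entry["hash"] != digest:
                return False
            prev_hash = entry["hash"]
        return True


log = AuditLog()
log.append({"url": "example.com/story", "shared_by": "some-account", "promoted": True})
log.append({"url": "example.com/story", "shared_by": "another-account", "promoted": False})
assert log.verify()  # chain intact
log.entries[0]["record"]["promoted"] = False  # tamper with history...
assert not log.verify()  # ...and verification now fails
```

Hash chaining is the same basic idea behind git histories and blockchain ledgers: it does not prevent tampering, but it makes tampering evident to anyone who re-verifies the chain.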

Google continues to maintain it doesn’t want “human interference with its algorithms”, he says. “So often in tech companies, anything involving a human is seen as a defect.”

But Pasquale says restoring human editors is also “an inevitable part of this process. Failure to keep human editors has led to the proliferation of fake news. We have to restore the integrity of journalism as a profession, not merely as a source of piecework, propaganda and PR.”

He says we need some sort of analysis and labelling process for data, which shouldn’t be that alien a concept. “We already accept labelling of drugs and food,” he says.

“What we need as a second step in the information economy is, we need to have information about the information we get.” That would help us decide which news we are going to trust, and which feeds we are going to follow. Too often, technology giants use assertions of “trade secrecy” to block offering any transparency about their algorithms and what they do.
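Pasquale leaves the form of such labelling open. As a hedged illustration, "information about the information" could simply be structured metadata that travels with a story, analogous to a nutrition panel on food. The ContentLabel schema and every field in it are assumptions of mine, not any platform's actual format.

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional


# A hypothetical "content label", by analogy with food and drug labelling:
# structured metadata about a story, not any real platform's schema.
@dataclass
class ContentLabel:
    url: str
    publisher: str
    sponsor: Optional[str]  # who paid for promotion, if anyone
    automated: bool         # was the post made by a bot or automated account?
    fact_checked: bool      # has an independent checker reviewed the claim?
    first_seen: str         # ISO 8601 timestamp of earliest observation


label = ContentLabel(
    url="https://example.com/story",
    publisher="Example News",
    sponsor=None,
    automated=False,
    fact_checked=True,
    first_seen="2017-05-12T09:30:00Z",
)
print(json.dumps(asdict(label), indent=2))
```

Even this much metadata would let a reader ask the questions the article raises: who funded this content, where did it come from, and has anyone checked it?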

Entities – meaning news organisations and websites as well as search platforms – have to recognise that people may read only the headlines in search returns, which can make a fake story look like a real one. Even a fact-checking website such as Snopes contributes to this problem, because a search can produce a result whose headline simply restates the fake claim. The claim may be shown to be false within the Snopes article, but many people never read past the headline.

The public probably need intermediaries – sites such as Google and Facebook, which carry the stories and headlines and links – to assess and address such problems, “and some of the funding for such an exercise probably has to come from intermediaries themselves”, especially as more and more revenue goes to intermediaries, not the media organisations supplying the news they link to.

Right to be forgotten

He thinks further development of a principle such as the EU’s “right to be forgotten” – to have links to irrelevant or outdated information removed from search engines – has a place, too.

“I know it is controversial and that there are concerns in the press” about links being removed to factual stories or information, but he thinks that in individual cases “a humane, compassionate response . . . is for the intermediary to intervene”.

Many people don’t understand the difference between sponsored content and organic content. These basic levels of media literacy should be taught in school to children from about age nine on, Pasquale believes.

And just addressing the “filter bubble” – where people increasingly only see news that aligns with and reinforces their existing point of view, thanks to social media and search algorithms – doesn’t work.

He points to factually misleading tweets from Fox News (although he notes something similarly misleading could come from left-wing news sources) and says offering such posts isn’t a proper counterbalance.

“The problem here is [that] modulation presumes rational deliberation.”

He argues that his suggested modes of regulation “would enable us to move a bit closer to a better modulated, more legitimate public sphere”.

He adds: “It’s really a plea for structure.” The current process offers little structure and is instead one of “extreme corporate power”.