Russia sought to use social media to influence EU vote, report finds
Review finds Russian-linked groups and nonstate actors spread disinformation in EU
European authorities blamed Russian groups on Friday for disinformation campaigns designed to depress turnout and sway public opinion in last month’s European Union elections.
The preliminary review by the European Commission and the bloc’s foreign policy and security arm found that Russian-linked groups and other nonstate actors had worked to undermine the credibility of the European Union through Facebook, Twitter and YouTube.
Officials said new regulations might be needed to force internet platforms to do more to stop the spread of deliberately false information.
“The evidence collected revealed a continued and sustained disinformation activity by Russian sources aiming to suppress turnout and influence voter preferences,” the report said. The report was the first official substantiation by the European Commission of the role that Russians and other groups played in disinformation in the May elections, which many investigators, academics and advocacy groups had warned about. It was a reminder of how active Russians and others continue to be in spreading divisive content online to inflame and stoke electorates all over the world, a strategy that the Kremlin had pioneered in the 2016 US presidential election.
Since then, Facebook, Twitter and others have vowed to clamp down on foreign interference and have worked on new technology and other methods to stop outside meddling during elections. But the report Friday highlighted how much work the platforms still needed to do to stay a step ahead of disinformation networks. The report also has implications for American officials ahead of the 2020 presidential election, with an increasing number of smaller, harder-to-detect domestic groups adopting Russia-like strategies to influence voters.
“The genie’s out of the bottle,” said Ben Nimmo, a senior fellow at the Atlantic Council’s Digital Forensic Research Lab who has been tracking disinformation efforts in Europe. “What we’ve seen over the past few years is an increasing number of actors, both state and nonstate, using similar methods online to interfere in democratic processes.”
European officials did not draw a direct link in the report between the disinformation campaigns and the Kremlin or provide details about what groups in Russia or elsewhere were behind the efforts. The report also stopped short of assessing whether the tactics had an effect on how people voted, with turnout in the elections having hit record levels. The report largely cited the findings of outside researchers who had been tracking the European elections.
Yet European officials said the report was significant because it highlighted the “new normal” of disinformation campaigns. “There was no Big Bang moment. There was no new Facebook-Cambridge Analytica case that we know of,” Vera Jourova, a European commissioner, said during a news conference in Brussels. Yet “the European elections were not free of disinformation.” She added that the continued online meddling was “something we cannot accept.”
Facebook said it had taken steps to protect the integrity of the European elections, including entering into partnerships with local fact-checking organizations, adopting new rules to show who was buying political ads on its platform and dedicating teams of employees to monitor election interference.
“The fight against false news will never be over,” the Silicon Valley company said in a statement in response to the report. “That is why we are making significant investments to remove fake accounts and clickbait and to promote high-quality journalism and news literacy.”
Twitter and Google did not respond to requests for comment. Independent investigators had long warned that Europe was vulnerable to disinformation campaigns ahead of last month’s vote. But eradicating disinformation campaigns was tricky in the elections, which were spread across 28 countries and 24 official languages.
In the run-up to the voting, researchers highlighted efforts by Russia-linked groups and those in favour of far-right policies to use Facebook and Twitter to spread false information and exaggerate political divisions. In particular, they identified hundreds of Facebook and Twitter accounts peddling disinformation, more than 1,000 examples of WhatsApp messages sharing suspicious materials and a mix of suspicious websites that spread varying degrees of misleading information, often taking advantage of local political divisions.
According to Friday’s report, Facebook blocked more than 1,700 pages, groups and accounts engaged in inauthentic behaviour targeting EU countries during the first three months of 2019. Voters in Britain, France, Germany, Italy, Poland and Spain were among those targeted. Jourova said Russian meddling had been detected in 1,000 cases since January.
US intelligence officials have warned that the 2020 campaign will also be targeted by foreign groups. In January, the Worldwide Threat Assessment written by government intelligence agencies said Russia would continue to use social media to amplify social and racial tensions in an effort to influence policy and elections.
The European Commission report said new internet regulations might be needed, rather than a reliance on the companies to abide by a voluntary code of conduct. The commission said it would complete a review by the end of the year that could result in new laws.
“More needs to be done by the platforms to effectively tackle disinformation,” the report said. Last month’s vote was seen as a referendum on Europe’s direction. On one side were nationalist and populist groups skeptical of the European Union’s influence on national affairs; on the other were those seeking more integration and cooperation. The results were mixed, with far-right groups performing well in some countries and liberal parties doing better in others.
‘The new normal’
The election demonstrated a shift in disinformation strategy. The report said the efforts were smaller and more localised than Russia’s widespread effort during the 2016 US campaign. Far-right groups and other nonstate actors have also adopted the techniques, the report found.
Nimmo said governments would continue to find it difficult to stop groups committed to using online platforms to spread disinformation and sow discord. He said that in addition to focusing on big companies such as Facebook and Twitter, authorities should scrutinize smaller sites such as Gab and Parler.
“The commission says that this sort of behaviour should not become the new normal,” he said. “That’s right, but hopelessly optimistic.” One emerging challenge for governments and social media platforms is that groups are not sharing outright false information, making the content harder to detect and remove. Instead, social media posts tend to take highly politicised views on news events of the day, such as immigration.
The report pointed to stories that said last month’s collapse of the government in Austria, which was a real event, was the result of the “European deep state.” Other posts said the recent Notre Dame cathedral fire in Paris occurred because of a decline of Western and Christian values. –New York Times