Social media firms should self-regulate on fake news, says EU report
Report is part of commission’s consultation process on controlling ‘disinformation’
Social media organisations like Facebook and Google should engage with a new system of international self-regulation to fight the proliferation of fake news, an independent report for the European Commission has recommended.
All the relevant stakeholders, including the public, journalists, the media industry, factcheckers, and the advertising industry, should be involved in drawing up, and monitoring the enforcement of, a code of conduct for social media that reflects the need to protect both freedom of expression and transparency.
In the longer term the state should support increased funding for quality journalism, the report argues.
The report, A Multi-dimensional Approach to Disinformation, was drawn up by a 32-member committee chaired by Dutch professor Madeleine de Cock Buning, who said she was surprised and pleased by the extent to which the giant social media companies represented on the committee, including Facebook, Google, and Twitter, engaged with and supported its work.
The companies have also in recent months introduced new internal mechanisms for rapidly taking down notified hate speech, which the commission suggests should happen within 24 hours.
The report, which eschews the expression “fake news”, defines “disinformation” as false information of any kind that is designed, presented and promoted to intentionally cause public harm or to make a profit. It does not deal with hate speech or defamation, both of which are already prohibited by law and which it says raise separate issues.
Its findings were unanimously agreed with the exception of the European consumer organisation Beuc, which favours a tougher mandatory approach to the responsibilities of social media companies.
Among other recommendations, it suggests that social media companies should share the controversial algorithms they use to rank and post stories in “safe spaces” where independent academics could assess them. It also argues that they should clearly label stories and material that are paid-for content, and use “source transparency indicators” to help users identify dubious content.
The report is part of the commission’s consultation process on controlling online disinformation; the commission is preparing an action plan for April.
On Monday, commissioner for the digital economy Mariya Gabriel welcomed the findings, which she said would inform the commission’s approach, citing new Eurobarometer figures which she said highlighted the urgent need and support for action.
Some 87 per cent of those surveyed throughout the EU said they had been confronted by fake news, with 70 per cent viewing it as a problem for democracy. While 70 per cent expressed confidence in news from traditional outlets, only 25 per cent had confidence in what they read online.
Self-regulation should go hand in hand with the strong promotion of media diversity and training for online literacy, the report says.