EU wants internet giants to remove extremist content in an hour
Commission issues series of recommendations on illegal material for online platforms
The European Commission has announced plans to step up its efforts to tackle illegal online content such as child sexual abuse material and hate speech.
It has issued a series of non-binding recommendations on procedures for online platforms such as Facebook and Google, as well as member states, to flag and remove illegal material. These also cover terrorist content, incitement to violence, counterfeit products, and copyright infringement.
The procedures are aimed at ensuring faster detection of content while still ensuring fundamental rights such as freedom of expression are retained.
The new operational measures focus particularly on terrorist content, with a one-hour rule calling on internet companies to remove extremist material within 60 minutes of being alerted to it.
In addition, online platforms are being asked to implement more proactive measures such as deploying automated detection tools, and to regularly report to the authorities on what steps they are taking to curb such content.
The new steps also include fast-track procedures to process referrals more quickly.
Moves to tackle illegal content were first addressed last September, when the Commission promised to monitor progress and to assess whether legislative measures might be necessary.
The Commission said it continues to look at introducing legislation to aid authorities in rooting out illegal online content.
“Online platforms are becoming people’s main gateway to information, so they have a responsibility to provide a secure environment for their users. What is illegal offline is also illegal online,” said vice-president for the digital single market Andrus Ansip.
“While several platforms have been removing more illegal content than ever before - showing that self-regulation can work - we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens’ security, safety and fundamental rights,” he added.