EU to give tech giants just 60 minutes to take terrorist content offline

Companies that fail to comply with takedown orders to face significant fines

Some 700 new pieces of official propaganda from Islamic State were disseminated online in January alone, the European Commission said. Photograph: Reuters

Technology giants such as Google, Facebook and Twitter will have just one hour to take down terrorist propaganda from the internet or face substantial financial penalties under new rules proposed by the European Commission.

The commission has said technology companies have a duty of care to ensure their platforms are not misused by terrorist groups such as Islamic State (also known as Isis and Da’esh) to easily spread content online.

Under the new rules, tech giants will be ordered to remove material that “incites or advocates committing terrorist offences, promotes the activities of a terrorist group or provides instruction in techniques for committing terrorist offences”.

The proposal calls for “strong and deterrent” financial penalties of up to 4 per cent of a company’s annual turnover for failing to comply with the new takedown rules.

The plan was confirmed by commission president Jean-Claude Juncker during his state-of-the-union address on Wednesday.

According to the commission, terrorist content continues to thrive online despite concerted efforts to stamp it out.

The commission has been working with technology companies on a voluntary basis over the last few years to ensure the removal of content that promotes terrorism. However, according to its figures, 700 new pieces of official propaganda from Islamic State were disseminated online in January alone.

“Many of the recent attacks in the EU have shown how terrorists misuse the internet to spread their messages. Today we say ‘no more’ to this misuse of the internet,” said commissioner for migration, home affairs and citizenship Dimitris Avramopoulos.

Speed of dissemination

The commission claims terrorist content is most dangerous in the hours after it first appears online because of the speed at which it spreads across social media.

Under the new legally binding deadline, technology companies must remove offending material within one hour of receiving a removal order from the authorities. Depending on the risk of terrorist content being disseminated, service providers will also be required to take proactive measures – such as the use of new tools – to better protect their platforms and users from terrorist abuse.

A framework is also planned to increase co-operation between tech companies, member states and Europol to ensure the new rules are feasible. This would include introducing effective complaint mechanisms and judicial remedies.

For platforms making use of automated detection tools, human oversight and verification should be in place to prevent erroneous removals, the commission said.

It also calls for annual transparency reports from both companies and member states on how they tackle terrorist propaganda and how successful these approaches are.

“This regulation is a response to citizens’ concerns. We propose specific rules for terrorism content which is particularly harmful for our security and for trust in the digital. What is illegal offline is also illegal online. The EU continues to stay engaged in order to build a safer, human-centric internet based on our values,” said commissioner for digital economy and society Mariya Gabriel.

Earlier this year, the commission announced plans to step up its response to tackling illegal online content such as terrorist propaganda, child sexual abuse material and hate speech. In March, it issued a series of non-binding recommendations for online platforms and member states to flag and remove illegal content to ensure faster detection of offensive material.

According to the commission, most of the social media giants have increased their monitoring of offensive material since signing up to its code of conduct in 2016. However, vice-president for the digital single market Andrus Ansip said that while self-regulation was working, takedown rates could still be improved.