Facebook restricts livestreaming, citing New Zealand shooting
Taoiseach Leo Varadkar to attend summit on tackling online terrorism in Paris
A lone gunman killed 51 people at two mosques in the city of Christchurch on March 15th while livestreaming the attacks on Facebook. It was New Zealand’s worst peacetime shooting and spurred calls for tech companies to do more to combat extremism on their services.
Facebook said in a statement it was introducing a “one-strike” policy for use of Facebook Live, temporarily restricting access for people who have faced disciplinary action for breaking the company’s most serious rules anywhere on its site.
First-time offenders will be suspended from using Live for set periods of time, the company said. It is also broadening the range of offences that will qualify for one-strike suspensions.
New Zealand prime minister Jacinda Ardern said the change addressed a key component of an initiative, known as the “Christchurch Call”, she is spearheading to halt the spread of violence online.
“Facebook’s decision to put limits on live streaming is a good first step to restrict the application being used as a tool for terrorists, and shows the Christchurch Call is being acted on,” she said in a statement.
Facebook did not specify which offences were eligible for the one-strike policy or how long suspensions would last, but a spokeswoman said it would not have been possible for the shooter to use Live on his account under the new rules.
The company said it plans to extend the restrictions to other areas over coming weeks, beginning with preventing the same people from creating ads on Facebook.
It also said it would fund research at three universities on techniques to detect manipulated media, which Facebook’s systems struggled to spot in the aftermath of the attack.
‘Work to do’
Ms Ardern said the research was welcome and that edited and manipulated videos of the March 15th mosque shootings had been slow to be removed, with the result that many people, including herself, had seen them played in their Facebook feeds.
Facebook has said it removed 1.5 million videos globally that contained footage of the attack in the first 24 hours after it occurred. It said in a blog post in late March that it had identified more than 900 different versions of the video.
Ms Ardern is due to lead a meeting with French president Emmanuel Macron in Paris on Wednesday that seeks to have world leaders and chiefs of tech companies sign a pledge to eliminate violent content online. Taoiseach Leo Varadkar is among those due to attend.
“There is a lot more work to do, but I am pleased Facebook have taken additional steps today alongside the Call and look forward to a long term collaboration to make social media safer by removing terrorist content from it,” Ms Ardern said.
Representatives from Facebook, Alphabet Inc’s Google, Twitter and other tech companies are expected to take part in the meeting, although Facebook chief executive Mark Zuckerberg will not be in attendance. - Reuters