Facebook looks to AI to monitor harmful behaviours

Social networking company will double the number of staff who work on safety

Facebook’s chief product officer Chris Cox told the firm’s Communities Summit in London that Facebook was keen to answer criticism aimed at its impact on society. Photograph: Yui Mok/PA Wire

Facebook is exploring ways to help cut down on abuse of its service, with the possibility of bringing in artificial intelligence to pinpoint harmful behaviour in its groups.

Speaking in an interview at the Facebook Communities Summit in London, the social media platform’s head of groups Jennifer Dulski said the company already used AI to monitor content related to terrorism, for example.

“We have a number of AI tools that we already use around areas like terrorism where we remove 99 per cent of all content related to Al Qaeda and Isis before it ever hits the site,” she said. “We’re looking at that in other areas, including child safety and other areas that might be harmful.”

The company also announced at the London event that it would double the number of people working on safety to 20,000.

“We definitely are looking at ways to improve the safety on Facebook all the time. This is something that is extremely important to us; it’s why we’re doubling to 20,000 people watching out for bad content that might be there,” Ms Dulski said.

“We’re also looking at ways to build better tools for admins themselves around moderation, so that we can highlight and flag things to them that they might want to pay attention to.”

The summit, the company’s second gathering of community leaders, brought together hundreds of administrators of Facebook groups. The company used the event to announce a new multimillion-dollar initiative that will provide grants of up to $1 million for community leaders to help develop their groups.

Mr Cox, Facebook’s chief product officer, said 2017 had been a tough year for the company. “We faced tough questions about our role in democracy, our role in discourse, our role in journalism, and our role in well-being,” he said.

The company started 2018 with a pledge to fix Facebook, and it has already begun with a change to users’ newsfeeds to concentrate more on personal interactions than on viral content and business pages.