Instagram bans graphic images of self-harm to curb impact on teenagers

Head of social network says more needs to be done to protect young people's mental health

The photo-sharing platform is banning graphic images of self-harm after UK health secretary Matt Hancock said social media companies ‘need to do more’ to curb their impact on teenagers’ mental health Photograph: Yui Mok/PA Wire


Instagram is banning graphic images of self-harm after UK health secretary Matt Hancock said social media companies “need to do more” to curb their impact on teenagers’ mental health.

The photo-sharing platform announced a series of changes to its content rules, including a ban on graphic images of self-harm and the removal of non-graphic images of self-harm from searches, hashtags, and the Explore tab.

Instagram said it will not be entirely removing non-graphic self-harm content, as it does not “want to stigmatise or isolate people who may be in distress”.

Head of the social network, Adam Mosseri, said: “Nothing is more important to me than the safety of the people who use Instagram. We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable in our community.

“I have a responsibility to get this right. We will get better and we are committed to finding and removing this content at scale, and working with experts and the wider industry to find ways to support people when they’re most in need.”

‘Important change’

Yesterday afternoon the site’s boss met Mr Hancock along with representatives from Facebook, Snapchat, Twitter, Pinterest, TechUK, Samaritans, Internet Association UK, and Google, to discuss content on suicide and self-harm.

After the meeting, Mr Hancock said: “This is an important change, but there are lots more things that we need to see, like transparency over how much damaging material there is. I also want to see, when people search for material about suicide and self-harm, that they are directed better to help – to the Samaritans, to the NSPCC. We’ve seen progress today, the discussions were productive, and a willingness to try and solve this problem.”

Molly Russell died in 2017 aged 14. Her family found material relating to depression and suicide when they looked at her Instagram account after her death.

‘Modern nightmare’

Mr Hancock added: “What happened to Molly Russell is every parent’s modern nightmare. I am glad Instagram have committed to me that they will now take down graphic self-harm and suicide content.

“This is best delivered in partnership, because it’s the social media companies who understand their own platforms. We made some progress today in terms of taking down some of the most egregious, harmful material that promotes suicide and self harm.

“But there’s more to do in terms of being clear what material’s up there and making sure that the behaviour of the site follows the best medical evidence.”

The NSPCC said the rule changes marked “an important step”, but that social networks were still not doing enough to tackle self-harm.

Charity chief executive Peter Wanless said: “It should never have taken the death of Molly Russell for Instagram to act. Over the last decade social networks have proven over and over that they won’t do enough to design essential protections into their services against online harms including grooming and abuse.

“We cannot wait until the next tragedy strikes.”

Ian Russell, Molly’s father, said: “I welcome the commitment made today by Adam Mosseri to ban all graphic self-harm content from Instagram.” – PA