TikTok steps up efforts to tackle Covid-19 vaccine misinformation
Company also updates community guidelines on acceptable content
TikTok has also strengthened its policies on bullying and harassment of victims of abuse. Photograph: Hollie Adams/Bloomberg
TikTok is stepping up its efforts to combat misinformation on the Covid vaccine on its platform, strengthening its policies as the vaccines are introduced worldwide.
The company said it would direct users searching for vaccine-related content to trusted information from respected experts via a banner. It will also detect and tag all videos containing words and hashtags related to Covid-19 vaccines, and a banner attached to those videos will redirect users to verifiable sources of information.
TikTok already has a policy of removing misinformation on vaccines from its platform, and said it would continue to do so.
In a blog post, TikTok Europe’s head of product and process, Kevin Morgan, said the company was furthering its efforts to make authoritative information about vaccines readily available.
“Keeping our community safe is a commitment with no finish line,” he said. “We recognise the responsibility we have to our community to be nimble in our detection and response when new kinds of content and behaviours emerge.”
Separately, the company said it would update its community guidelines and provide additional resources to support user wellbeing.
The move is part of a wider strengthening of TikTok’s community guidelines. Among the changes are improvements to its policies on self-harm and suicide content, with feedback and language from mental health experts incorporated into the guidelines. The platform said it would avoid normalising self-injury and dangerous weight-loss behaviours.
TikTok has strengthened its policies on bullying and harassment of victims of abuse, making guidelines on banned behaviours and content – including doxxing and cyberstalking – more explicit. The company has also adopted a more extensive policy against sexual harassment.
The platform has taken a tougher line on the promotion of dangerous dares, games and other acts that may jeopardise the safety of younger users, and has updated its guidelines on violent extremism to spell out in more detail what is considered a threat.
Updated resources will be introduced over the next week to support people who may be struggling with self-harm or suicidal thoughts.
In the first six months of the year, TikTok removed more than 104 million videos globally for violating its community guidelines or terms of service, about 1 per cent of the content uploaded to the platform. The majority were removed before they could be reported or even viewed.