Hundreds of jobs could be at risk in TikTok’s UK operation as the company utilises artificial intelligence (AI) more widely in content moderation.
The move, which will affect TikTok's trust and safety teams in the UK, is part of a wider restructuring of roles across the UK, South Asia and South East Asia.
The plan will see some of the work reallocated to other European sites, though TikTok did not say which locations. The reorganisation is not expected to affect its Irish operations.
Other trust and safety team roles, as well as operations, will remain in the UK, where TikTok currently employs 2,500 people.
“We are continuing a reorganisation that we started last year to strengthen our global operating model for trust and safety, which includes concentrating our operations in fewer locations globally to ensure that we maximise effectiveness and speed as we evolve this critical function for the company with the benefit of technological advancements,” a spokesperson for TikTok said.
Despite the cutbacks, TikTok is still recruiting in the UK and the company said it was on track to end the year with a larger headcount than at the beginning of the year.
TikTok said affected employees would be able to apply for internal roles, and suitable candidates would be given priority.
TikTok’s human moderation teams are trained to spot accounts that may be in use by a child and can then suspend them. AI-based systems can identify signals, such as keywords and in-app reports from the user community, that might point to a potentially underage account.
Automated technologies can also reduce human moderators’ exposure to distressing content. TikTok said there had been a 60 per cent decrease in such content viewed by moderation teams in the past year.
Automated systems currently account for more than 85 per cent of content removed from TikTok for violating community guidelines.
TikTok’s UK operations are subject to the Online Safety Act, which came into force last month and requires online platforms to protect UK users from illegal material such as child sexual abuse and extreme pornography.
Platforms are also required to prevent children from accessing harmful and age-inappropriate content. - Additional reporting: PA