TikTok and children

Dangerous content


Sir, – While shocking, RTÉ's Prime Time report will come as no surprise to those of us working to safeguard children and young people in Ireland.

A particularly frightening aspect of this investigation is that the Prime Time team did not seek out topics, “like”, comment on, or engage with any particular content; yet within minutes, the accounts which TikTok assumed belonged to a 13-year-old were drawn down a “mental health rabbit hole”.

At Childline, we hear from children and young people every day about the effect that technology can have on their mental health, but to see it played out in real time was truly shocking.

Since February alone, 145 children have reached out to us at Childline to talk about self-harm, and 65 of those children are repeat contacts who have since gone on to seek our help with suicidal ideation.


Remember, this is more than mere numbers.

Behind every one of our contacts is a child or young person desperate for help. So I ask, with these shocking statistics in mind, what are the technology platforms really doing to safeguard children?

Let’s be clear: they are for-profit businesses that are not in the business of child protection.

It is painfully evident from RTÉ's investigation that they are pumping out the most dangerous messages possible to already vulnerable children, and it is high time steps were taken to address dangerous algorithmic amplification.

We must have the correct regulations and comprehensive legislation in place so that platforms are not permitted to continuously bombard children and young people with dangerous content. It is the responsibility of the platforms that create and implement the algorithms to protect their users from such harm. Simply put, they must put children’s safety ahead of profit-driven interests. – Yours, etc,

JOHN CHURCH,

CEO,

ISPCC,

Dún Laoghaire,

Co Dublin.