Children are seeing things no one should see on smartphones. What can be done about it?

This week the US surgeon general warned of ‘ample indicators’ that social media can have ‘a profound risk of harm to the mental health and wellbeing of children and adolescents’


Digital-savvy parent Emma thought she was doing enough to help her 13-year-old son negotiate the world of social media safely until he inadvertently stumbled on a disturbing video clip on his smartphone last week.

A first-year pupil at a south Co Dublin secondary school, he had been added to a Snapchat group by a girl he vaguely knew. After a relaxed Friday night family dinner, the boy went up to his bedroom where he saw a notification on his phone from another member of this group chat. On opening it, footage of a cat being tortured to death played automatically without warning. Although he stopped watching as soon as his brain registered what he was seeing, “it left an image for him to fill in the blanks”, says his mother, “so it has been very difficult dealing with him. He has been so upset”.

The boy left the group straight away, before telling his parents. Due to the ephemeral nature of Snapchat, “there was no concrete evidence to hold anyone accountable for the distress caused”, says Emma, who asked that her name be changed to protect her son’s identity. To report the image – for all the good that would have done as this video is believed to have been shared widely across different media platforms in recent weeks – he would have had to hold his finger down on it.

She knows her son is only one of millions of children across the globe who are “seeing things that nobody should ever see”. Parents have a “duty to educate and protect our children”, she acknowledges, “but we cannot bear this responsibility alone. Social media platforms must be held accountable for the traumatic content they allow to circulate”.


It is only 16 years since the first iPhone was released in the US, in 2007. Since then the smartphone has transformed so many ways we function, for better and for worse. We are still struggling to understand the impact, not least on those born into a fully digital world. At times it seems the smartphone is implicated in every problematic aspect of youngsters’ lives, clouding the positive opportunities for connection and information it offers. But the red flags are too numerous to ignore.

This week the US surgeon general, Vivek Murthy, warned of the “ample indicators” that social media can have “a profound risk of harm to the mental health and wellbeing of children and adolescents”. Addressing a societal issue that is too often still regarded as primarily a parenting problem, he called for a “multifaceted” effort to maximise the benefits and reduce the risk of harm posed by social media. His 19-page advisory recommends steps to be taken by five categories of players in this: children and adolescents themselves; parents and caregivers; technology companies; policymakers; and researchers.

Murthy’s intervention comes more than a year after the American Psychological Association (APA) called on him to alert the public to how psychological scientists were warning that the use of digital media platforms could exploit biological vulnerabilities among youth. “It has long been established that adolescence is associated with neurological changes that promote cravings for social attention, feedback and status. Research demonstrates that digital media satisfies these cravings at a neural level, activating the same neural regions as drug addiction,” the APA said.

A UK coroner’s ruling last September that 14-year-old London schoolgirl Molly Russell had “died from an act of self-harm while suffering from depression and the negative effects of online content” was described by campaigners as a global first and a “big tobacco moment” for social media. After her death in 2017, the teenager’s heartbroken family doggedly demanded records of her social media history. It revealed what her father described as a “demented trail of life-sucking content”. As with any user, the recommendation engines of Instagram and Pinterest, powered by artificial intelligence (AI), had unrelentingly pushed content at her, based on their computations of what she was most likely to click. Social media platforms know engagement is more intense when high emotions, such as anger, disgust and despair, are aroused.

“It’s a ghetto of the online world that once you fall into it, the algorithm means you can’t escape it,” said Molly’s father, Ian. Of 16,300 pieces of content saved, liked or shared by Molly on Instagram in the six months before she died, 2,100 were related to suicide, self-harm and depression.

There has “undoubtedly” been a Molly Russell among teenage suicides here in Ireland, says Fiona Jennings, head of policy and public affairs at the Irish Society for the Prevention of Cruelty to Children (ISPCC). Her organisation approaches the use of social media from the perspective of children’s protection rights versus their participation rights.

“It can be a difficult balance. We firmly believe children should be in the digital environment and avail of all the wonderful opportunities that are there.” Yet there is still not enough evidence to conclude that social media and the wider online environment are safe enough. It is necessary to look at how a tool of connectedness has an antisocial flipside, she suggests, increasing the sense of isolation among young people looking on.

The chief executive of CyberSafeKids in Ireland, Alex Cooney, notes the new tack taken by the US surgeon general. “Often we have heard from academics that we just don’t have the evidence to link declining mental health with the use of technology and, more specifically, social media. What he said is we don’t have the evidence that social media doesn’t harm them.” Murthy spoke of the “gaps in our full understanding” of the impact of social media on mental health and added “at this point [we] cannot conclude it is sufficiently safe for children and adolescents”.

Cooney wonders what information gathered at US government level prompted the surgeon general to speak out so strongly. Similar questions arise here, where Minister for Health Stephen Donnelly spoke recently of the need to “get behind the curtain” of how tech firms operate, to protect children from the disruptive and damaging bombardment of content by algorithms.

Murthy highlighted how research has shown that children and adolescents who spend more than three hours a day on social media face double the risk of mental health problems, including symptoms of depression and anxiety. Here in Ireland, the proportion of girls reporting use of social media for three or more hours daily has increased from 54 per cent to 66 per cent over three biennial surveys conducted in the west by Planet Youth, which is piloting an Icelandic prevention model to improve young people’s wellbeing. Its latest findings released this week show boys reporting the use of social media for three or more hours daily increased from 37 per cent in 2018 to 50 per cent in 2022. Boys are also much more likely to have additional daily screen time when gaming, with one in four saying they spend three or more hours daily on video games alone.

“The relationship between screen use and mental health in these Irish surveys is very stark,” says Planet Youth coordinator Emmet Major. “A simple correlation in the survey shows that 27 per cent of girls using social media for three hours or more daily report good mental health whilst 44 per cent of the girls using social media for less than three hours daily report good mental health.” The corresponding figures for boys are 52 per cent and 68 per cent respectively.

US psychologists Jonathan Haidt and Jean M. Twenge took a global perspective, producing compelling correlations to suggest that the rapid transition to social lives mediated by smartphones is the “prime suspect” in the worldwide rise in rates of teenage depression, loneliness, self-harm and suicide. They used data collected by the Programme for International Student Assessment every three years since 2000, which showed school loneliness remained relatively stable for more than 10 years before increasing sharply over 2012–2018 in 36 out of 37 countries.

“This synchronised global increase in teenage loneliness suggests a global cause, and the timing is right for smartphones and social media to be major contributors,” they wrote in the New York Times in 2021, on publication of their study in the Journal of Adolescence. They looked at other global trends that might have had an impact on teenage loneliness such as rising income inequality, unemployment and decline in family sizes, but “only smartphone access and internet use increased in lock step with teenage loneliness”. While causation could not be proven, they were confident more years of data would help to provide a more complete picture.

Number one in Murthy’s list of steps for technology companies to take is: “Conduct and facilitate transparent and independent assessments of the impact of social media products and services on children and adolescents”. Social media companies do this research but sharing it is another matter. It took a whistleblower to disclose in 2021 how Facebook had kept secret internal research that concluded its Instagram app aggravates body image issues for teenage girls. Staff at the company who had been studying the impact on younger users repeatedly found it had harmful effects for a large proportion, but particularly teenage girls. “We make body image issues worse for one in three teen girls,” said a slide from one internal presentation in 2019.

The wheels of policy and regulation of online services move slowly. Seven years after the Law Reform Commission here recommended a new statutory oversight system, the State’s first Digital Safety Commissioner, lawyer Niamh Hodnett, was appointed last January to promote digital safety and to oversee efficient take-down procedures by online services. Categories of harmful online content that she is mandated to tackle include illegal material relating to child sex abuse or terrorism; content “readily identifiable” as cyberbullying; and material promoting eating disorders, or promoting or providing instructions for suicide and self-harm. But it remains to be seen what resources will be put at her disposal to try to make this watchdog role effective.

In the US, states are taking their own initiatives. In Utah, for example, from March 1st next year social media companies will have to implement a curfew for minors in the state, barring them from their social media accounts from 10.30pm to 6.30am. Cooney, for one, says she will be fascinated to see how that works.

Meanwhile, social media is and will continue to be a daily issue for parents. Knowing AI “friends” have been added into the mix is the latest worry. It is an inconvenient truth that many adults model excessive use of smartphones and then wonder why small children are so attracted to them and teenagers cannot put them down. We are all up against the guile of corporations that use behavioural scientists and developmental psychologists in determining what engages users of their online products, no matter what their age.

Cooney says parenting was never easy but social media has made it so much harder. “We are having to check it; create boundaries around when, where and how it’s used, and to encourage self-regulation.” Parents are also trying to reassure children and teenagers that they are perfect as they are, “although everything is screaming at them online that they are not perfect”, she says. “What frustrates me is when I talk to companies I hear ‘we are going to create more parental tools so parents can keep an eye on what their kids are doing…’ They are suggesting it is a parenting issue alone.”

Psychotherapist Richard Hogan, director of the Therapy Institute and author of Parenting the Screenager, sees online technology “massively disrupting” family environments. But, he contends, “if schools and families work together as a community we have a solvable issue”. For a start, he would like to see parental solidarity around primary school initiatives not to allow children to have smartphones before sixth class. When some parents give in, other children can be isolated for not having the technology, he points out, but it is too powerful a force for under-12s. “I was in a school recently and a principal told me a senior infant had consumed pornography.”

When it comes to helping children navigate the online world, we know many parents can feel overwhelmed, says Jane McGarrigle of Webwise, which is funded by the Department of Education and the European Commission to promote safer and better internet use. It provides resources to help parents talk to children about it. Young people stress the importance of the internet in their lives and sometimes they feel that adults don’t “get it”, says McGarrigle. But equally, they recognise they need support from school and from parents and guardians. “It is important that we listen to young people and include them in conversations.”

The ISPCC also has a digital hub for parents and, as Jennings points out, “for a child or young person to come to a parent or trusted adult with an issue, they have to have some sort of confidence that they kind of know what they’re talking about”. Concerted and sustained action is needed at so many levels to try to ensure children and young people can live more safely in the digital world adults have created for them.