An Instagram for children platform would be akin to an "alcopop" used to recruit for future commercial gain, an Oireachtas committee has heard.
Head of the Irish Society for the Prevention of Cruelty to Children (ISPCC), John Church, was also critical of tech companies' track record in balancing the needs of children with shareholder interests.
He was addressing the Oireachtas Committee on Media during its pre-legislative scrutiny of the Online Safety and Media Regulation Bill 2020 and broader issues around children’s use of social media platforms.
Asked about reports that Facebook was considering launching a version of its popular Instagram platform for users under the age of 13, Mr Church replied that "anything that's set up directly at kids is almost like the alcopops for platforms. It's just a form of a private platform trying to recruit for future sustenance."
Alex Cooney, chief executive of the CyberSafeKids charity, noted that Snapchat had a children's version but that, in her organisation's experience, eight to 13-year-old children were more interested in the regular version used by their older peers.
Mr Church also expressed concerns over end-to-end encryption, which he said tech platforms were planning to introduce in order to ensure user privacy.
“That has an unwitting issue of actually protecting perpetrators online as well with the online trading, which is a terrible term to use, but it’s a big business, online trading of child sexual abuse materials,” he said.
“We landed a helicopter on Mars, surely we can actually protect children at the same time as providing privacy.”
He was critical of tech companies’ progress on child safety and raised the difficulty of balancing shareholder value with safeguarding children.
“We would have regular meetings on a quarterly basis with the likes of Facebook [and] TikTok,” he said, adding that the investment was “nowhere near” what was required.
Are the restrictions working?
Asked by Sinn Féin TD Imelda Munster about the age of people signing up for social media accounts, Ms Cooney replied that while minimum ages are generally between 13 and 16, her organisation was aware of "quite some numbers" of children aged eight and nine.
“Clearly the restrictions that are in place do not work,” she said. “That’s something that we need to look at and it may be that we could look at other ways around this without, what we don’t want is that children have to hand over more of their data in order to determine their age.” Ms Cooney said a possible solution would be to scan data for age verification and then immediately delete it.
In its submission to the committee, CyberSafeKids argued that the forthcoming legislation on internet safety must “explicitly name” an online safety commissioner in order to make the creation of such a post “clear and specific”, a point echoed by others during the proceedings. It also said the Bill fails to address the need for a strong individual complaints mechanism.
The Children’s Rights Alliance similarly criticised the scheme of the Bill for not specifically stating that an online safety commissioner would be nominated from an overall media commission.
The ISPCC noted a recent study by the National Anti-Bullying Research and Resource Centre at Dublin City University that found 28 per cent of children surveyed reported being victims of cyberbullying during the recent lockdown.