Ireland is “blocking” legal actions on behalf of children affected by “addictive and dangerous” design of social media algorithms because of how it implemented a significant EU directive, the Irish Council for Civil Liberties has said.
The European Commission is currently investigating a complaint from the ICCL about how the State has transposed the EU Collective Redress Directive.
The core of the complaint is that the way in which the directive was transposed breaches EU law because it does not allow non-profit organisations such as the ICCL to raise money from third party funders for collective public interest actions.
“If the State were not blocking us from taking such cases we would already be doing so,” the ICCL’s tech expert, Dr Johnny Ryan, said on Thursday.
He described as “profoundly consequential” this week’s finding by a Los Angeles court jury that Meta, owner of Facebook, Instagram and WhatsApp, and YouTube, owned by Google, were negligent in deliberately designing addictive products that hooked a child user and led to her being harmed.
The companies were also held liable for failure to warn users and the plaintiff, identified only as KGM, was awarded damages of $6 million.
Now aged 20, KGM gave evidence that she became addicted to YouTube aged six and to Instagram aged nine. By the age of 10, she said she had become depressed and was engaging in self-harm. When she was 13, she was diagnosed with body dysmorphic disorder and social phobia, which she attributed to her use of the platforms.
Her lawyers alleged some features built by social media companies into their platforms, including video autoplay and an infinitely scrollable feed, are designed to keep people on the apps and have made the products addictive.
The KGM verdict came just a day after Meta was ordered, in a separate case in New Mexico, to pay $375 million in civil penalties. That arose from findings the company misled consumers about the safety of its platforms and enabled harm, including child sexual exploitation, against its users.
The defendant companies in both cases denied the claims and the outcomes are expected to be appealed.
Ryan said the decisions mean the tech giants will no longer be seen as “invulnerable” to lawsuits.
As well as opening the door for many similar cases in the US, there was potential for EU-wide cases which would have to be taken in Ireland because the tech companies are headquartered here.
Under the directive, non-profit bodies such as the ICCL may take public interest cases on behalf of children affected by the “addictive and dangerous design of social media algorithms”, or other violations of the General Data Protection Regulation, he said.
The ICCL already has a case in the Commercial Court against Microsoft, Ryan said, adding that it is Ireland’s “first-ever class action” under the directive.
The directive acknowledges non-profit organisations do not have the money to take complex cases and requires states to take measures to allow such organisations to raise the necessary funding. Safeguards were also provided to ensure funding arrangements were not at odds with protecting the interests of victims.
There is “a fatal contradiction” in the State’s transposition of the directive because it allows the ICCL to bring collective redress litigation but also preserves the prohibition against raising the funds to do so at the necessary scale, said Ryan.
That prohibition is contained in Ireland’s laws on champerty and maintenance, dating back to the 17th century.
A 2022 legislative amendment provides that maintenance and champerty do not apply to commercial arbitration proceedings but the State “has not changed the rules for vulnerable children”, Ryan said.
Barrister Michael O’Doherty, an internet law expert, described the outcome of the US cases as “potentially seismic”, including for parents concerned about the impact of social media on their children but who until now had felt “powerless”.
The tech platforms may be even more exposed to similar “defective product” claims in the EU because they do not have the same total immunity in respect of the content they host as they enjoy under US law, he said. Under the EU Digital Services Act, such protection is “more qualified”.