We all must do more to stop online abuse of children and adults

Digital safety decisions must not be rushed simply because the issue is topical

Government and Opposition support for appointing a digital safety commissioner, as recommended in a 241-page report published by the Law Reform Commission in 2016, has been gaining momentum since the Horan case reached its conclusion last week.

While cases involving such appalling manipulation of young children are rare in Ireland, there is absolutely no doubt that the internet creates environments where awful activity happens to children and adults alike.

The technology companies, large and small, that provide the apps, the social media platforms and the other web spaces where such abuse may occur must do more. Most have not done remotely enough to limit or prevent such activities, especially with regard to children.

But it is important for the discussion around these issues not to default to a more limited focus on children, as has begun to happen here. And it is critical that any decisions made concerning new statutory posts be carefully considered, not rushed through because the issue is topical and politicians feel the need to be seen to act.


The role, as defined in the report, would have enormous reach: the commissioner could independently judge whether internet content should be removed, using a code of practice yet to be developed, and could impose fines if takedown notices were not complied with.

Australia

The commissioner’s office would also have a significant educational role, to promote digital safety for adults and children. For the latter, it is suggested this would be done in tandem with the Office of the Ombudsman for Children (OCO).

Would any of this actually work? Or would the office just become an ineffectual, tax-eating quango?

The proposed Irish office is intended to be closely modelled on the eSafety commissioner role in Australia, established in 2015.


In its first year of operation, the office received more than 11,000 complaints. It issued no fines, though it can impose penalties of up to Aus$18,000 (€11,700) a day for non-compliance with a takedown notice.

In late 2016, the Australian group Digital Rights Watch expressed concern about the takedown process in a report to the United Nations special rapporteur on the promotion and protection of the right to freedom of opinion and expression.

"Without adequate transparency, it is impossible to gauge the extent to which social media platforms are removing content in a way that is proportionate and legitimate. There is little accountability to identify when content has been wrongfully removed, and there is no formal system of appeal or due process," it stated.

I asked Matthew Rimmer, professor of intellectual property and innovation law at Queensland University of Technology (QUT), about Australia's experience more than two years on.

While Government and Opposition politicians here have spoken about the Australian role as if it were a well-established success, Rimmer notes: “The Australian eSafety commissioner is still a relatively new regulatory experiment by the Australian Government”, one that possesses “a mix of soft and hard regulatory powers” – an oversight role as well as an educational one.

The position emerged from the same public concerns present now in Ireland: “Problematic areas in Australian law – cyberbullying, image-based abuse, online child sexual abuse material – which are not adequately dealt with by police or existing regulators and our current patchwork laws.”

Censorship

He adds: "Some critics have been concerned that the body will be paternalistic and intrude into internet censorship and surveillance. Others wonder whether the body is a bit anodyne – just a bit of symbolism, rather than [providing] strong regulation, designed to convince voters that politicians are doing something about the topic."

Overall, more than two years on, he feels “mildly positive” about the role. He doesn’t think surveillance or censorship fears have been realised, but he also believes the role is underdeveloped. “It still seems very early days.”

The eSafety role, he thinks, addresses “genuine child safety issues in the digital environment which haven’t been well dealt with by governments or IT companies or the community”.

That last observation is, I think, key. This isn’t just about the platforms or the apps or websites. Both governments and communities – including parents – must do more, too.

But the line between effective regulation and overreach is a fine one. Government intervention on the basis of moral panic genuinely risks serious consequences: unwarranted restrictions, a soured business climate, public surveillance, censorship, and a nanny state. Both adults and children must be given the wherewithal to navigate the digital world, not be sequestered from it.

Ireland has poor form in such areas: a long history of state censorship, and of imposing a data retention regime that remains in breach of a 2014 European Court of Justice ruling in a case brought by Digital Rights Ireland.

If a new role is to be created, Ireland must move carefully. Not only must it (unlike Australia) create such a role within EU laws on privacy, surveillance and e-commerce, but the State also needs to ensure the new office does not become a tool for acting on any given government's whims.