Victims of harmful online abuse should be able to sue the social media platforms hosting the material, while platforms that refuse to remove harmful material should be prosecuted, an Oireachtas committee heard this week.
The Committee on Justice, discussing online harassment on Wednesday, heard that anyone with an online identity should be compelled to disclose their offline identity, and that an online regulator “with teeth” was necessary.
Colm Brophy TD (Fine Gael) said that, while not in favour of censorship generally, he believed it was "completely insane" that large, international publishing platforms were unregulated.
“It’s impossible to comprehend the idea of television, or radio or newspapers blanketly saying, ‘Even though this is being broadcast by us, transmitted by us, published by us, we have no responsibility at all’ and yet for some reason . . . we continue to tolerate this notion that large, multi-billion pound corporations can continue on in this fashion.
“One of the things we should move towards is that people who are victims would have the right to sue the platforms . . . If we were in that situation and we had a few substantial settlements I believe the providers would move very quickly.”
Caroline Counihan, legal director with the Rape Crisis Network of Ireland, agreed victims should have that right and "a few hefty settlements would soften the cough of those online platforms that have not been careful".
However, she said, if this were the only recourse available in instances of online sexual harassment or abuse, it would put too much onus on victims, who wanted images and material removed as quickly, privately and cheaply as possible.
Jim O’Callaghan TD (Fianna Fáil) suggested the most effective mechanism would be an online regulator with the power to order the removal of abusive material.
Ms Counihan said: “My feeling is that if it is done properly that has the potential to be a quick, safe, private way to get the images down with the greatest possible speed . . . Survivors really don’t need the extra trauma of delay and uncertainty.
“Ultimately I don’t see why these people should not be criminalised if they don’t take down material that they have been put on notice is harmful.”
John Church, chief executive of the ISPCC, said a regulator could help platforms by assisting them in identifying harmful material. He also said children were curious and wanted to visit sites that were not age-appropriate.
“Children lie about their age and I think there needs to be a greater onus on social media platforms to verify that age.”