Opinion

There is no reason to treat directors of social media companies differently

Like other directors, they should be liable for harm caused by their company

Personal liability is imposed on company directors and others where an act or omission by them has led directly to the commission of an offence by their company.

This applies in health and safety legislation, environmental legislation and consumer protection legislation, and under the Companies Act, the Competition Act and the Data Protection Act.

The courts have determined that where an individual has direct responsibility for an area in a company and has wilfully or negligently allowed that company to commit an offence, then he or she can be prosecuted for that offence.

There is very good reason for these provisions. They mean that senior people in companies take responsibility for minimising the risks of harm their companies could cause. If a company is only likely to face a fine for reckless or negligent behaviour, it may calculate that the risk is worth taking given the profits to be made. If a director could face prison time for knowingly allowing such behaviour to begin or to continue, that will almost certainly make that director think twice.

Unimaginable

There is much that is positive about social media. It allows us to connect and communicate and organise and share information in ways that were unimaginable a few decades ago. As a consequence of gathering so much data on their users, social media companies are now in extraordinarily powerful positions globally.

The companies have said they are committed to making their platforms safe places for users. Yet the evidence to date is that they are continuing to fail to do so.

Over the last 18 months, as members of the Oireachtas Media Committee, we have listened to children's rights groups, young people, those dealing with addiction, victims of online bullying and trolling, whistleblowers from inside the companies, and legal experts. Again and again, the view expressed is that social media companies are not taking the question of online harm sufficiently seriously.

We also know how social media companies have allowed our data to be deliberately misused and how their algorithms push users in particular directions. This was evidenced politically by the Cambridge Analytica scandal and the riots at the US Capitol in January 2021. But it is also evident in other areas, such as where young people with concerns about their bodies are driven to sites that encourage eating disorders.

Every time activities such as these are exposed, the social media giants apologise and promise it won't happen again. But it does. Mark Zuckerberg of Facebook has testified several times before the US Congress, and the word he has used most often is "sorry".

Safety

Our first obligation as legislators is the safety of our citizens and those who reside here. That applies as much to the digital space as it does to our streets and our communities.

We have seen that when social media companies have been responsible for wrongs such as large data breaches, they receive what many of us would regard as enormous fines. But these companies simply write them off as business costs.

If a health and safety director of a social media platform became aware of risky behaviour at one of their offices that could result in serious injury to employees or customers, they would immediately take action to prevent or minimise it. They would promote a culture of good health and safety practice and examine all potential risks. Partly because it is the right thing to do, but also because they know that if they didn't take action, they could be held personally liable in the event of an incident.

Should the same principle not apply to a director of online safety or the director responsible for designing a company algorithm? Should they not be encouraging a culture to minimise online harm and examining all risks?

If we can develop this culture of safety by design in social media companies, so that when algorithms and platforms are being created the best interests and safety of citizens are prioritised and our data is used responsibly, we will have a much better digital environment for everyone.

Culture change

Self-regulation by the social media companies is not working. Fines after investigations by a new regulator will be written off as business costs, but the harm will have been done to individuals in the meantime. Requiring culture change in companies by holding individuals responsible has worked in many other areas of our business environment. We need to apply the same standards to online safety.

It is for these reasons that we are seeking to amend the Online Safety and Media Regulation Bill, which comes before the Seanad this week, to strengthen it by holding individuals to account. We have discussed this with Minister for Media Catherine Martin and her officials, who we know are also deeply committed to making the digital space safer.

This will be a critical debate about the values we want to underpin social media design and interaction.

Malcolm Byrne and Shane Cassells are Fianna Fáil Senators and members of the Oireachtas Media Committee. The Online Safety and Media Regulation Bill is before the Seanad at committee stage this week.