Why does nobody seem to care about the latest Facebook data breach?

With social media giants holding a monopoly, we need ways to hold them accountable when they misuse our data – which is where regulatory enforcement comes in

Accountability is not just about punishment – it is about making sure that we can trust the services and products we use, and from which we get a huge amount.

Accountability is a pretty simple idea: actions should have consequences. This isn’t just so we can get some sense of justice for misdeeds, but so we create incentives that make it worth a company or individual’s while to avoid them in the first place. It isn’t so simple to get right, though, and we are making patchy progress when it comes to major tech firms.

Accountability needs two key things to work: knowledge – information about what is happening – and a response – consequences felt by whoever has done something wrong.

The knowledge aspect doesn’t happen by accident. We mandate transparency in lots of areas of public life – how tax money is spent, how services are performing, how companies behave – and we create oversight bodies with the power to look into non-public information and highlight where things are going wrong.


We also make sure that we have a free media capable of investigating and scrutinising actions, and making these understandable to the public.

As for the response, this usually comes in two ways: through regulatory enforcement of rules by a formal body, or through market forces, where “consumers” can choose to switch their loyalty, or threaten to do so, if they don’t like what someone is up to.

Let’s say a politician is caught in an act of corruption. For her to be held accountable, the existence of that act first needs to be known. A journalist might bring it to light by looking into publicly disclosed information, or an independent body with the power to scrutinise private transactions might find something and make it public.

That politician might find herself in court, if she has broken the law, or subject to disciplinary proceedings. But in democracies we mostly rely on a market response: she will stand for election, giving the citizens of her country the opportunity to vote her out of office by choosing someone else.


Or they might not. Sometimes, voters don’t care about the misdeed, or judge that a politician has atoned or compensated in some way. And sometimes voters don’t have a real alternative. In politics we call this an unfair election; in other marketplaces we call it a monopoly.

Last week Facebook announced it had experienced the largest data breach in the company’s history. Details of 50 million accounts are believed to have been accessed, allowing hackers, in theory at least, to log into Facebook – and linked apps like Tinder, Spotify and Pinterest – as if they were the user, read messages and view friends’ profiles.

The company made this information public shortly after it discovered the breach – a transparency move mandated by GDPR, the EU data protection rules, which requires companies to disclose breaches within 72 hours of discovery.

The company is sharing additional information with the Irish Data Protection Commission, Facebook's privacy regulator in Europe. On Wednesday the commission announced it would open an investigation, which the Wall Street Journal reports could result in fines of up to $1.63 billion.


So far, so good, at least for a regulatory response. But when it comes to the market response, the reaction appears to be more muted. As Donie O’Sullivan, who covers cybersecurity for CNN, was quoted as saying on Tuesday, “I think we all have data breach fatigue.”

Back in March, when the Cambridge Analytica story broke, many were shocked at the way that Facebook had, in the past at least, approached data management. While much of the information about its actions was theoretically in the public domain, it was only when it was presented in a way the public could understand – most notably via Channel 4 undercover reporting – that it resonated.

Surveys show that trust levels in Facebook have since dropped by as much as 66 per cent. Daily engagement by users also dropped a little, but not many people are actually shutting down their accounts. The stock price, which had fallen dramatically at the end of March, recovered, and indeed by May was higher than it had been before the story broke.

This time around, the share price has dropped only moderately – traders do not believe that this revelation will result in users turning away from the service.

That could be because users do not see a real alternative to move to. Facebook has created a compelling product, and in a market where scale is key, it has triumphed over rivals to become the dominant player.

Much like we ensure real competition in elections, we have rules to make sure markets work properly. In July Margrethe Vestager, Europe’s chief monopoly fighter, levied a $5 billion fine on Google for misusing its dominance in the marketplace for phones.

Ms Vestager told the New Yorker recently that she would "keep an eye" on Facebook, saying "you are more than welcome to be successful and to dramatically outgrow your competitors . . . But if you grow to be dominant, you have a special responsibility not to misuse your dominant position."

Regulation is crucial, but rules are only starting to catch up with technology. This means we all have a role to play – as consumers, voters and citizens – in using the power we have to make technology fairer and safer.

Liz Carolan is a senior associate at the Open Data Institute and adviser to the Open Data Charter. She was co-founder of the Transparent Referendum Initiative (tref.ie)