It must seem like Groundhog Day for Facebook. The company is facing court action once more for failing to adequately protect its users’ data. This time, though, the person pursuing Facebook isn’t Max Schrems, the man who has thrown an uncomfortable spotlight on the company’s data practices in recent years. Instead, it is journalist and writer Peter Jukes, representing more than a million Facebook users in England and Wales.
The mass legal action has been launched on behalf of the group of users, claiming that the social media giant failed to protect their personal data. Filed in London, it alleges Facebook’s settings allowed a third-party app – This Is Your Digital Life – to access the personal information of users without their consent, not only harvesting the data of the app’s users, but also that of their Facebook friends.
If that sounds familiar, it should. The personality quiz was the app at the heart of the Cambridge Analytica scandal, a data breach that ended up affecting about 87 million people.
The action is seeking damages from Facebook on behalf of the 1 million UK users involved. “It is only right that we, as consumers, hold Facebook to account for failing to comply with the law and for putting our personal data at risk, to ensure that this is not allowed to happen again,” Mr Jukes said.
If only it were that simple. History tells us that – with few exceptions – large corporations rarely learn from their mistakes unless the punishment is severe enough to leave them no choice. Facebook has already paid fines over the misuse of data, but the sums involved are unlikely to cause the company much trouble, representing a fraction of the billions in profits it has banked in recent years.
But the real damage is more difficult to quantify. As countless security experts have warned, the reputational damage goes much deeper.
The social media giant has largely been the author of its own misfortune. A series of blunders has seen the company embark on a seemingly endless round of apologies. It has been mired in scandal after scandal, from carrying out psychological testing on Facebook users without their knowledge or consent, to the aforementioned Cambridge Analytica affair, to the alleged manipulation of elections, to crimes being live-streamed on the platform. It feels as if barely any time goes by without some negative news story coming to light.
Look at the uproar that greeted a poorly presented update to the terms and conditions of its WhatsApp subsidiary. Instead of being seen as a necessary move to allow businesses to communicate with their customers through the messaging platform, it drove millions of users to rival apps, many pledging to abandon WhatsApp for changing the rules. It seemed as if the dire predictions that accompanied Facebook’s 2014 purchase of WhatsApp were coming to pass: the company was going after the data it had promised not to share.
And, in the dying days of 2020, there was more to come. Facebook took out a full-page advertisement to stand up to Apple, coming out against new measures in iOS 14 that show exactly what data apps are gathering and tracking, giving users more informed choices about whether they should allow it. Whoever thought that stunt was a good idea needs to take a long, hard look at themselves.
Facebook claimed to be standing up for small businesses, arguing that the measures would make it more difficult for those companies to use their ad budgets effectively. But instead of a righteous stand against the overreach of tech companies, it smacked of self-protection. For Facebook, data is the key to its advertising success. Take away the ability to build incredibly detailed profiles of consumers and advertising clients may find far less reason to spend money on Facebook ads.
As the saying goes, trust is hard won and easily lost, and, quite simply, consumers don’t really trust Facebook any more. The image of the social network as a plucky young upstart has long since disappeared, replaced by a multibillion-dollar business that profits off our data, swallows up rivals and, when it can’t do that, imitates the more compelling features of their products. David has become Goliath, and no amount of rehabilitation measures – banning vaccine misinformation, transparency on political ads, downplaying political content on users’ newsfeeds, pledging to protect our personal data – can change that.
The question should not be whether we trust Facebook. Instead, we need to ask: why should we trust Facebook?