Time for regulators to face down Facebook

Net Results: Why we shouldn’t buy the ‘co-ordinated inauthentic behaviour’ excuse

“Co-ordinated inauthentic behaviour”. If you were a powerful company seeking a disingenuous euphemism to disguise activities that, in the past, you had singularly failed to identify and block – you might even say, if you had a globe-spanning business with revenue streams that depend on not paying too much attention to egregious behaviour – you might choose this phrase.

Facebook came up with this usefully vague expression in its announcement this week that it had removed 32 accounts and associated ads intended to influence the upcoming US mid-term elections. The November elections will see significant numbers of US House and Senate seats up for grabs – enough, potentially, to flip either Republican-controlled chamber to the Democrats.

About 300,000 people had followed the pages, some of which used Facebook’s event-planning feature to co-ordinate 30 events that 4,700 people indicated they’d be interested in attending.

While Facebook said it could not discern who had created the pages (they were set up through identity and location anonymising tools), it stated that some of the activity was consistent with what it had seen before on the platform from Russia's Internet Research Agency (IRA) prior to the 2016 US presidential election.

While Facebook is to be applauded for identifying the accounts and shutting them down, doing so in this “oh, by the way” fashion neither gets at the heart of the problem of “co-ordinated inauthentic behaviour” (which, frankly, just sounds like a boring flashmob) nor offers any feasible solution.

Still, this conveniently timed company announcement got more attention in the US than the far more revealing, wide-ranging and Facebook-eviscerating UK parliamentary select committee report into dangerous and damaging fake news (aka co-ordinated inauthentic behaviour), released over the weekend.

‘Dark ads’ 

The report included screenshots of an appalling range of pro-Brexit “dark ads” on Facebook, released publicly by the committee after Facebook, following much opposition, finally handed them over last Friday.

Dark ads are highly targeted ads that appear only to a target group and cannot be seen – and therefore refuted – by others. The ads were not identified as campaign ads, nor did they state who had funded them. Some were click-through ads intended solely to gather data on individuals. Others included out-and-out lies – such as the claim that Turkey was about to join the EU.

Vote Leave volunteer turned whistleblower Shahmir Sanni revealed over the weekend that the Vote Leave campaign – which had agreed to abide by a nationwide, temporary suspension of Brexit campaigning following the murder of Labour MP Jo Cox – nonetheless ran these ads in the hours after the moratorium was agreed.

The parliamentary report is a model of what a dedicated, fearless and articulate group of elected representatives can produce. The detailed document, which took months to compile, incorporates evidence from 61 witnesses, 20 oral hearings (including the widely viewed sessions with Cambridge Analytica chief executive Alexander Nix and company whistleblower Christopher Wylie) and 150 written submissions.

The 12 MPs on the committee have done the job the US Congress has so far failed to do – not least in its mostly lukewarm questioning of Facebook chief Mark Zuckerberg when he appeared before its committees last spring. And they have completed a telling initial investigation that should have informed our own politicians as they questioned Facebook on Wednesday about the shocking recent Channel 4 undercover Dispatches programme, which showed that content moderators were told to leave up alarmingly racist, violent and abusive content. And don't forget: the Dáil also awaits Facebook's promised volumetric data on ads run during the Eighth Amendment referendum, which could be very interesting indeed.

The UK report follows investigative threads that link Facebook pages and ads, Cambridge Analytica's secretive and abusive data-gathering, and Russian interference in British elections. Committee chair Damian Collins told The Observer he believed the evidence received so far from Facebook represented just the "tip of the iceberg". Just think about that.

The report concludes with recommendations on regulating platforms and technology companies such as Facebook that should be closely examined in Ireland, too, and in the rest of the EU and US. It recommends that lawmakers introduce "clear legal liability" for such companies "to act against harmful and illegal content" – a proposal that seems even more prescient in light of the Dispatches programme.

Companies should be fully audited, the report says, especially in regard to security and mandated transparency of the closely guarded algorithms that govern how content and data are managed. Targeted “micro ads” for political campaigns should be banned. Electoral commissions should have far greater powers over advertisements.

Companies should adhere to a code of ethics. And, in a move that would address many of the legal, political and societal problems introduced by these huge platforms, they should be redesignated into a category in between publisher and platform that would demand greater responsibilities and liabilities.

Will the UK and EU – or a US increasingly reluctant to regulate business – take these meaningful actions? And will our own politicians do the same, given that Facebook and so many of these companies have their European base in Ireland?

Based on this overwhelming, and still growing, body of evidence: they must.