Facebook whistleblower calls for independent review of Irish data watchdog

Frances Haugen calls for ‘adequate’ funding for DPC at Oireachtas committee meeting

Any harmful impacts of Facebook’s planned virtual-reality metaverse platform will be hidden “behind a curtain” unless regulation of social media is reformed, an Oireachtas committee has heard.

Frances Haugen, a former Facebook employee turned whistleblower, called for the company to be “forced to be more transparent”, and for the Irish Data Protection Commission (DPC) to be “adequately” funded and subject to an independent review, during an appearance before an Oireachtas committee on Wednesday.

Ms Haugen, appearing before the Oireachtas committee on culture to discuss online disinformation, said Facebook “never set out to sow discord or pull us apart”. She said that when the company tried to address the problem of people spending less time on the social media platform, it moved to prioritise “engagement”, which often meant “hate”.

The whistleblower recommended Facebook and other platforms be mandated to undertake regular risk assessments of their services, and to disclose all risks identified.

Facebook should be required to report a range of data and performance statistics, rather than just its financial profit and loss, so the company “feels that shame” of decisions that caused harm, she said.

“Content that drives young women to eating disorders is not illegal . . . If we only focus on things like terrorism content, things that are grossly illegal, we will not be able to hold these companies to account for the cost, the pollution, they are causing,” she said.

Changes introduced by Twitter prompting people to read articles or links before sharing them had not been copied by Facebook because of concerns they would marginally reduce profits, she told politicians.

Technological advancements

Advancements in technology had “always outpaced regulation”, she told the committee.

Facebook’s culture rewarded staff who focused on growth, and “averted their eyes” to problems or concerns, she said.

Ireland needed to “adequately fund” the DPC, as it had been “flooded” with data protection complaints, and had been criticised by other European regulators for issuing decisions against social media companies that did not have “enough teeth”, she said.

Ms Haugen said the DPC was widely considered to have “stepped back” from properly enforcing data protection rules, and as a result tech companies based in Ireland had “once again got away with it”.

She called for an independent review into the data protection watchdog, “so that it can start to enforce the law thoroughly and boldly”.

Ireland had a “unique role” in holding tech companies to account, given many based their European headquarters in Dublin, she told the committee.

The DPC has previously rejected criticisms that it goes too easy on Big Tech platforms such as Facebook when regulating them in relation to European data privacy laws.

‘Fully aware’

Social media companies were “fully aware of the harms caused by their products”, which were ignored “in favour of growth and profit”, Ms Haugen told the committee on Wednesday.

Often “the most egregious harms” caused by decisions made in the “shiny, glass headquarters” of Facebook’s Dublin office were felt in places such as Myanmar and Ethiopia, she said.

“The ethnic violence fuelled by Facebook in those countries are the opening chapters of a book too horrific to read,” she told the committee.

A new online safety commissioner to be set up in Ireland should not have a mechanism where individuals could directly raise complaints, she said, as it was likely that watchdog would be “swamped by complaints”. However, she said she would support a “class system”, where particular groups affected by online hate could raise concerns as a group.

Facebook chief executive Mark Zuckerberg said last October, in response to Ms Haugen’s charges, that it was deeply illogical to argue that the company deliberately pushed content that made people angry.

A statement from the company in the same month said: “We’ve always had the commercial incentive to remove harmful content from our sites. People don’t want to see it when they use our apps and advertisers don’t want their ads next to it.”