Man sues Facebook over having to watch ‘extremely disturbing content’
Former content moderator claims psychological injuries from ‘graphic and violent’ posts
Facebook CEO Mark Zuckerberg in Dublin last April. The company said it provided extensive training and full-time support to moderators. Photograph: Tom Honan/The Irish Times.
A former content moderator is suing Facebook Ireland for psychological injuries he claims he suffered as a result of his work, which involved “repeated and unrelenting exposure to extremely disturbing, graphic and violent content”.
In one of the first cases of its kind to come before a European court, Chris Gray began a legal action against the Irish subsidiary of the California tech giant in the High Court on Wednesday, related to his work moderating content for Facebook in Dublin during 2017 and 2018. He is also suing CPL Solutions, which hired him as a contractor.
Mr Gray was one of about 15,000 moderators working for Facebook worldwide, reviewing content posted on the social network, and deciding whether it should be filtered or removed.
He says in his personal injuries claim he was required to view “very disturbing” photographs and videos, including executions, lethal beatings, stonings, whippings, the abuse of children, animal torture and extreme sexual content.
Among the material he had to view was footage showing the large-scale and coordinated abuse and murder of the Rohingya people in Myanmar, massacres in the Middle East and the torture of apparent migrants in Libya.
Compounding the impact of the distressing imagery was the pressure to make the right decision about what was to be done with it, and maintain a 98 per cent accuracy rating, he claims. Over time, he became “numb and desensitised” to the content and increasingly irritable, sensitive, argumentative and aggressive.
In his legal proceedings, Mr Gray (53) claims he noticed a “slow creep” whereby his “personal and political views were becoming increasingly influenced by the insidious content he was required to view”.
He alleges his work affected his sleep and that he often dreamt about the material and woke “with a fright, concerned not by the content, but by whether or not he had marked it correctly during his shift”.
Mr Gray claims he faced into “what seemed like a relentless flow of extreme and graphic material” without adequate support or training. His demeanour gradually changed, he alleges, and he was unable to discuss work-related issues with his superiors in a calm and professional manner.
A spokeswoman for Facebook said it recognised “reviewing certain types of content can sometimes be difficult”, but that it provided extensive training and full-time support to moderators, and employed technical solutions to limit their exposure to graphic material as much as possible.
“This is an important issue and we are committed to getting this right,” said the spokeswoman.
‘Training and support’
CPL was not available for comment but it previously said it cared “deeply about our employees and take any concerns they raise very seriously” and that “extensive training and support” was provided to content moderators.
Mr Gray’s legal action is expected to be the first of a number of cases, possibly as many as a dozen, to be taken by former moderators for Facebook who claim they are suffering from post-traumatic stress disorder.
Their cases are being handled by the Dublin legal firm Coleman Legal Partners. Lawyer Cori Crider – a director of Foxglove, a UK-based not-for-profit group supporting the Irish case against Facebook – said she wanted to see “social media’s factory floor cleaned up”.
“In a few years’ time we are going to look back on these conditions and see them the way that we now see early unsafe factory work in a steel mill or a meat-packing plant in the early 20th century,” she said.