Facebook moderators were advised to take up “karaoke and painting” to cope with their exposure to extreme content on the platform, a whistleblower has said.
The social media giant outsources the job of its content moderators, whose role is to keep the platform safe by monitoring it for toxic content such as terrorist attacks, child abuse, self-harm and graphic violence.
Workers are forced to sign non-disclosure agreements, which unions say could be illegal, meaning they are unable to speak about their experiences with friends and family.
Whistleblower Isabella Plunkett has told a Dáil committee that workers need proper psychological supports, new limits on their exposure to toxic content and an end to non-disclosure rules.
She said: “The content is awful. It would affect anyone. It finally started to get to me. I have horrible lucid dreams about all the things I’ve seen.
“For months I’ve been taking antidepressants because of this content. My job is to train the algorithm.
“Facebook’s fantasy is that one day human content moderators will no longer be required.
“That means I’ll get all kinds of content. Hate speech, bullying, graphic violence, suicides, abuse, child exploitation and the list goes on.
“Some of my colleagues have it even worse; they’re working child abuse queues, self-harm queues all day.
“A manager tells them that they should limit their exposure to two hours maximum a day but this isn’t happening.”
'Wellness coaches'
Ms Plunkett told the Dáil Committee on Trade that instead of clinical mental health supports, workers are offered “wellness coaches” to cope with their exposure to the toxic material.
She said: “To help us cope, they offer us wellness coaches. These people mean really well but they’re not doctors.
“They suggest karaoke and painting. But sometimes you don’t always feel like singing, frankly.
“I got referred to the company doctor once and I was supposed to hear about a follow up but I haven’t heard anything since.”
Ms Plunkett said staff are operating in a “climate of fear” and that she was afraid to appear before the Committee because “Facebook have confused us and undermined our belief in the right to speak.”
No working from home
In addition, staff say they were not allowed to work from home during the pandemic and were given no explanation as to why.
Ms Plunkett said she was instead told not to come into contact with her mother, who has had cancer on two occasions.
Facebook content moderators are not directly employed by the company and earn around half the pay of its own staff, the Committee heard.
But the campaign group Foxglove told the Committee that Facebook would be unable to function without them.
Spokeswoman Cori Crider told the Committee: “I can put it no better than an engineer did at Facebook.
“Content moderators are the people literally holding this platform together, they are the ones keeping the platform safe.”
Foxglove, a group of lawyers and tech experts, have been campaigning for better working conditions for Facebook moderators.
Ms Crider called on politicians to intervene, saying “light touch regulation of social media has failed”.
Fionnuala Ni Bhrogain of the Communications Workers Union warned of the “chilling effect” non-disclosure agreements have on employees.
She said two employees were contacted to remind them of the agreements prior to the meeting of the Committee on Wednesday.
She said: “From the very start of their employment, this has had a chilling effect and creates an atmosphere where workers fear retaliation, meaning that they do not feel they can vindicate their right to raise legitimate issues and concerns in their workplace.
“We have been advised that workers are frequently reminded of the existence of these agreements and that they are prohibited from discussing any and all aspects of their work, whether that be with the trade union representative or even with their own families.
“Social media platforms and their outsourcing partners exploit moderators’ lack of legal training to make demands for secrecy that are possibly unlawful.
“Two moderators received messages seeking to reassert the secrecy from one of the two outsourcing firms in advance of this hearing.
“This climate of fear has chilled workers’ participation in legitimate democratic processes and is unacceptable.”
Foxglove raised these issues in a meeting with Tánaiste and Trade Minister Leo Varadkar in January.
However, they said his response arrived only at 7pm the day before Wednesday’s Committee meeting, leaving them unable to submit it as evidence to the Committee.
Facebook have been contacted for comment.