- Facebook content moderators are calling for an end to NDAs, which prohibit them from talking about work.
- Moderators are contractors tasked with sifting through violent content, such as suicide and child abuse.
- The moderators are also calling for better mental health support and for full-time employment with the pay and benefits that come with it.
Facebook’s content moderators are urging the company to improve benefits and update nondisclosure agreements that they say promote “a culture of fear and excessive secrecy.”
In a letter addressed to Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg – as well as executives at outsourcing companies Covalen and Accenture – a group of moderators said: “Content moderation is at the heart of Facebook’s business model. It is crucial to the health and safety of the general public. Yet the company treats us unfairly, and our jobs are dangerous.”
Their demands are threefold:
- Facebook must change the NDAs that prohibit moderators from speaking about their working conditions.
- The company must provide better mental health support, including better access to clinical psychiatrists and psychologists. As the letter puts it: “It’s not that the content can ‘be difficult at times,’ as Facebook describes it – the content is psychologically harmful. Imagine watching hours of violent content or children abuse content online as part of your daily work. You cannot remain unscathed.”
- Facebook should make all content moderators full-time employees and provide them with the salary and benefits that in-house employees enjoy.
Facebook did not immediately respond to Insider’s request for comment. A company spokesperson told The Verge that moderators have access to mental health care “when working with difficult content,” and that moderators in Ireland specifically have “24/7 onsite support.”
Covalen and Accenture did not immediately respond to requests for comment.
Friday’s letter comes after years of content moderators denouncing Facebook’s treatment of them, even as they are tasked with sifting through horrific content on its platforms. That content can include violent physical and sexual abuse, suicides, and other graphic imagery.
A moderator employed through Covalen, a Facebook subcontractor in Ireland, told the Irish Parliament in May that she was offered “wellness coaches” to cope, but that this was not enough.
“These people have good intentions, but they are not doctors,” moderator Isabella Plunkett, 26, said in May. “They suggest karaoke or painting, but you don’t always feel like singing, frankly, after seeing someone smashed to pieces.”
Zuckerberg said at a company-wide meeting in 2019 that some of the content moderators’ accounts of coping with the work were “a bit dramatic.”