An explosive report from The Verge found that YouTube content moderators working through Accenture were required to sign a legal document acknowledging that their job could cause PTSD. Later on Friday, Verge reporter Casey Newton confirmed on Twitter that the legal document had been sent to Facebook moderators as well, linking to a Financial Times report. Newton added that he was investigating whether Twitter moderators had been made to sign a similar document.
In a statement to The Verge, an Accenture spokeswoman said the company, which runs a content moderation site for YouTube, is dedicated to the "well-being" of its moderators. The purpose of the document, she said, was to ensure workers had a "clear" idea of the kind of content they would moderate. But others, including moderators themselves, strongly disagree.
What the document states — The "Acknowledgment" section of the document requires signees to express agreement with the following: "I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and it could even lead to Post Traumatic Stress Disorder (PTSD).
"I will take full advantage of the weCare program and seek additional mental health services if needed. I will tell my supervisor/or my HR People Adviser if I believe that the work is negatively affecting my mental health."
Major red flags — Although the document lists "resources" for these emotionally fatigued workers, it's worth noting that Accenture's "wellness coaches" are not licensed therapists or medical doctors. The document also requires workers to tell a supervisor or HR adviser if they believe the work is "negatively affecting" their mental health. It does not clearly assure workers that HR won't view them as a liability and terminate their contracts after such a personal disclosure.
Plus, some workers have said they were threatened with firing if they didn't sign. Unclear language, reported termination threats, and inadequate resources for worker health justifiably raise major labor red flags among onlookers.
For several years now, journalists have dug deep into the dark world of content moderation. Reports on content moderation for social media abound with complaints about disturbingly low wages, stress, depression, suicidal thoughts, and little to no corporate support.
Yet despite these deeply disturbing reports of deteriorating mental and physical health, improvement has been virtually nonexistent. With The Verge's report out, it is chillingly clear that these companies, which publicly laud their progressiveness and vow commitment to employees' happiness, remain very much the root of the malaise and misery.