Facebook is again facing questions over its treatment of content moderators after a moderator told an Irish parliamentary committee that the company is not doing enough to protect the workers who sift through violent and disturbing content on its platform.
Isabella Plunkett, who works for Covalen, an Irish outsourcing firm that hires content moderators as contractors, told the committee that non-employee moderators do not have adequate access to mental health resources. Covalen allows an hour and a half of “wellness time” each week, for example, but the “wellness coaches” the company provides are not mental health professionals and are not equipped to help moderators cope with the traumatic content they regularly encounter. Plunkett told the committee that these wellness coaches sometimes suggested activities like karaoke or painting.
“The content is horrible, and it would affect anyone,” she said at a press conference after the hearing. “No one can be okay watching graphic violence seven to eight hours a day.” She said moderators should have the same benefits and protections as full Facebook employees, including paid sick time and the ability to work from home. Plunkett also pointed to Facebook’s use of nondisclosure agreements, which she said has contributed to a “climate of fear” that discourages moderators from speaking out or seeking outside help.
In a statement, a Facebook spokesperson said the company is “committed to working with our partners to provide support” to the people who review content. “Everyone who reviews content for Facebook goes through an in-depth training program on our community standards and has access to psychological support to ensure their well-being,” the spokesperson said. “In Ireland, this includes 24/7 on-site support with trained practitioners, an on-call service and access to private healthcare from the first day of employment. We are also employing technical solutions to limit their exposure to potentially graphic material as much as possible. This is an important issue, and we are committed to getting it right.”
This is far from the first time these questions have been raised. The working conditions of content moderators, who spend their days sifting through the worst content on the platform, have long been a problem for Facebook, which relies on non-employee moderators around the world. Last year, the company reached a settlement with US-based moderators who said their work led to PTSD and other mental health issues.
As part of the settlement, Facebook agreed to make several changes to the way it handles content sent to moderators for review. It introduced new tools that let moderators view videos in black and white with muted audio, making often violent and graphic content less disturbing to watch. It also added features that make it easier to jump to the relevant parts of longer videos, reducing the total time spent watching them. The company has also invested heavily in artificial intelligence in the hope of one day further automating its moderation work.
But Facebook may soon have to answer questions about whether those measures go far enough to protect content moderators. The committee plans to ask representatives of Facebook and its contracting companies to appear at another hearing to answer questions about the treatment of these workers.