Facebook CEO Mark Zuckerberg announced on Wednesday that the social network will bring on 3,000 new employees to review videos of “people hurting themselves and others on Facebook—either live or in video posted later.” “We’re working to make these videos easier to report so we can take the right action sooner,” Zuckerberg wrote, “whether that’s responding quickly when someone needs help or taking a post down.”
Zuckerberg’s announcement came in response to a series of recent incidents of violence broadcast on Facebook Live. A father in Thailand killed his 11-month-old on the live-streaming platform. Facebook admitted it took too long to remove videos of a fatal shooting in Ohio. Yet the company’s response, hiring a massive number of people to screen suicides, murders, and other violence, has raised serious questions of its own.
For the people filling these positions, trauma inevitably awaits. A new short documentary, The Moderators, shows the stark reality facing the mostly Asian workers tasked with scrubbing offensive content from social media, including child porn and bestiality. Reviewing the film, The New Republic’s Josephine Livingstone wrote that it reveals “the psychological toll that exposure to these images may be taking on these laborers, who are working in vast numbers.”