On a busy day, contract employees in India monitoring nudity and pornography on Facebook and Instagram will each view 2,000 posts in an eight-hour shift, or more than four a minute.
They are part of a 1,600-member team at Genpact, an outsourcing firm with offices in the southern Indian city of Hyderabad that is contracted to review Facebook content.
Seven content reviewers at Genpact said in interviews late last year and early in 2019 that their work was underpaid, stressful and sometimes traumatic. The reviewers, all in their 20s, declined to be identified for fear of losing their jobs or violating non-disclosure agreements. Three of the seven have left Genpact in recent months.
“I have seen women employees breaking down on the floor, reliving the trauma of watching suicides real-time,” one former employee said. He said he had seen this happen at least three times.
Reuters was unable to independently verify the incidents or determine how often they may have occurred.
Genpact declined to comment.
The working conditions described by the employees offer a window into the moderator operations at Facebook and the challenges faced by the company as it seeks to police what its 2 billion users post. Their accounts contrast in several respects with the image presented by three Facebook executives in interviews and statements to Reuters of a carefully selected, skilled workforce that is paid well and has the tools to handle a difficult job.
Ellen Silver, Facebook’s vice president of operations, acknowledged to Reuters that content moderation “at this size is uncharted territory”.
“We care deeply about getting this right,” she said in January. “This includes the training reviewers receive, our hiring practices, the wellness resources that we provide to each and every person reviewing content, and our overall engagement with partners.”
While rejecting the Hyderabad employees’ assertions about low pay, Facebook said it has begun drafting a code of conduct for outsourcing partners but declined to give details.
It has also said it will introduce an annual compliance audit of its vendor policies this year to review the work at contractor facilities. The company is organising its first-ever summit in April to bring together its outsourcing vendors from around the world, with the aim of sharing best practices and bringing more consistency to how moderators are treated.
These efforts were announced in a blog post on Monday by Justin Osofsky, Facebook’s vice president of global operations.
Facebook works with at least five outsourcing vendors in at least eight countries on content review, a Reuters tally shows. Silver said about 15,000 people, a mix of contractors and employees, were working on content review at Facebook as of December. Facebook had over 20 content review sites around the world, she said.
Over a dozen moderators in other parts of the world have talked of similar traumatic experiences.
A former Facebook contract employee, Selena Scola, filed a lawsuit in California in September, alleging that content moderators who face mental trauma after reviewing distressing images on the platform are not being properly protected by the social networking company.
Facebook in a court filing has denied all of Scola’s allegations and called for a dismissal, contending that Scola has insufficient grounds to sue.
Some examples of traumatic experiences among Facebook content moderators in the United States were described this week by The Verge, a technology news website.
Pressure, Lack of Experience
The Genpact unit in Hyderabad reviews posts in Indian languages, Arabic, English and some Afghan and Asian tribal dialects, according to Facebook.
On one team, employees spend their days reviewing nudity and explicit pornography. The “counter-terrorism” team, meanwhile, watches videos that include beheadings, car bombings and electric shock torture sessions, the employees said.
Those on the “self-harm” unit regularly watch live videos of suicide attempts — and do not always succeed in alerting authorities in time, two of the employees said. They told Reuters they had no experience with suicide or trauma.
Facebook said its policies called for moderators to alert a “specially trained team” to review situations where there was “potential imminent risk or harm.”
The moderators who spoke to Reuters said in the instances they knew of, the trained team was called in when there was a possibility of a suicide, but the reviewers continued to monitor the feed even after the team had been alerted.
Job postings and salary pay-slips seen by Reuters showed annual compensation at Genpact for an entry-level Facebook Arabic language content reviewer was 100,000 Indian rupees ($1,404), or just over $6 a day. Facebook contended that benefits made the real pay much higher.
The workers said they did receive transport to and from work, a common non-cash benefit in India.
Moderators in Hyderabad employed by another IT outsourcing firm, Accenture, monitor Arabic content on YouTube on behalf of Google for a minimum of 350,000 rupees annually, according to two of its workers and pay slips seen by Reuters. Accenture declined to comment, citing client confidentiality.
Facebook disputed the pay analysis, saying Genpact is required to pay above industry averages. The outsourcer, while declining to comment on its work for Facebook, said in a statement that its wages are “significantly higher than the standard in the industry or the minimum wage set by law.”
The Genpact moderators in Hyderabad said Facebook sets performance targets, known as Average Review Time or Average Handling Time, which are reassessed from time to time.
“We have to meet an accuracy rate of 98 percent on massive targets,” one of the moderators told Reuters. “It is just not easy when you are consistently bombarded with stuff that is mostly mind-numbing.”
They said they often took work home on their laptops to keep up.
Silver said handling time was tracked to assess whether Facebook needs more reviewers and whether its policies are clear enough. But she acknowledged some older procedures may have led moderators to feel pressured.
The company also said it was increasing restrictions on workers’ remote access to its tools.