Inside The Tormented Lives Of PTSD-Stricken, Underpaid Facebook Moderators

Facebook content moderators – some paid just $28,800 per year to sift through an endless bombardment of “rape, torture, bestiality, beheadings, suicide and murder” – have been coping with the traumatic images they see in a number of ways, according to The Verge‘s Casey Newton.

Illustration by Corey Brickley 

Some moderators get through the day by telling dark jokes about committing suicide. Others smoke weed during breaks to numb their emotions (“Moderators are routinely high at work,” notes Newton). Employees have even been caught having sex in stairwells and in a room reserved for lactating mothers, in what one employee described as “trauma bonding.”

“You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off,” one content moderator told the Guardian last September.

The Verge‘s three-month investigation into the troubled lives of a dozen current and former Facebook content moderators centers on Phoenix contractor Cognizant, which employs 1,000 of Facebook’s 15,000 content reviewers around the world.

The employees describe the emotional toll that their job takes on them – as they grow distant from loved ones, anxious and isolated – in a workplace “that is perpetually teetering on the brink of chaos.” 

The panic attacks started after Chloe watched a man die.

She has spent the past three and a half weeks in training, trying to harden herself against the daily onslaught of disturbing posts: the hate speech, the violent attacks, the graphic pornography. In a few more days, she will become a full-time Facebook content moderator, or what the company she works for, a professional services vendor named Cognizant, opaquely calls a “process executive.”

For this portion of her education, Chloe will have to moderate a Facebook post in front of her fellow trainees. When it’s her turn, she walks to the front of the room, where a monitor displays a video that has been posted to the world’s largest social network. None of the trainees have seen it before, Chloe included. She presses play.

The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking. –The Verge

Some moderators are starting to “embrace fringe views”

After watching countless “conspiracy videos and memes,” some moderators have begun to believe alternate theories about world events, while others have become paranoid. 

One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.” –The Verge

Content moderators are not allowed to bring their cell phones into the workspace, or to use writing utensils and paper, for fear they “might be tempted to write down a Facebook user’s personal information.” Employees are required to put their personal belongings into lockers, while small items like hand lotion must be placed in clear plastic bags so that managers can always see them.

Pushed to perfection

Facebook requires Cognizant and other contractors to strive for 95% accuracy – meaning that when Facebook employees audit the decisions made by content moderators, they agree 95% of the time. Cognizant has never hit that target, usually hovering in the high-80s to low-90s.

Each week, Facebook audits around 50 or 60 of a moderator’s roughly 1,500 decisions, according to one employee. Decisions are initially reviewed by a second Cognizant employee (a “QA”) who makes $1 more per hour, after which a subset is audited by Facebook employees to determine the accuracy score.

Officially, moderators are prohibited from approaching QAs and lobbying them to reverse a decision. But it is still a regular occurrence, two former QAs told me.

One, named Randy, would sometimes return to his car at the end of a work day to find moderators waiting for him. Five or six times over the course of a year, someone would attempt to intimidate him into changing his ruling. “They would confront me in the parking lot and tell me they were going to beat the shit out of me,” he says. “There wasn’t even a single instance where it was respectful or nice. It was just, You audited me wrong! That was a boob! That was full areola, come on man!” –The Verge

Fearing for his safety, one QA began to bring a concealed gun to work, while fired employees “regularly threatened to return to work and harm their old colleagues.” 

“Accuracy is only judged by agreement. If me and the auditor both allow the obvious sale of heroin, Cognizant was ‘correct,’ because we both agreed,” said the employee, adding, “This number is fake.”
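To make concrete why moderators call the number “fake,” here is a minimal sketch (Python, purely illustrative – not any tool Facebook or Cognizant is reported to use) of an agreement-based accuracy score. It only measures whether the moderator and the auditor match, so a mistake they both make still counts as “correct.” The sample size and 95% threshold follow the figures above; the decisions themselves are hypothetical.

# Illustrative sketch: agreement-based "accuracy" as described above.
# Names, decisions, and numbers are hypothetical except the ~60-post
# weekly audit sample and the 95% target mentioned in the article.

def agreement_accuracy(moderator_decisions, auditor_decisions):
    """Return the share of audited posts where moderator and auditor agree."""
    assert len(moderator_decisions) == len(auditor_decisions)
    matches = sum(
        1 for m, a in zip(moderator_decisions, auditor_decisions) if m == a
    )
    return matches / len(moderator_decisions)

# Hypothetical week: 60 of a moderator's decisions are sampled for audit.
moderator = ["remove"] * 40 + ["allow"] * 20
auditor   = ["remove"] * 40 + ["allow"] * 17 + ["remove"] * 3

score = agreement_accuracy(moderator, auditor)
print(f"Agreement score: {score:.1%}")  # 95.0% -- meets the target
# If both reviewers wrongly allow the same post (e.g., an obvious drug sale),
# it still counts as a match, so the score can overstate true accuracy.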

Moderators also have to quickly decide whether a post containing offensive language should remain on the site.

A post calling someone “my favorite n—–” is allowed to stay up, because under the policy it is considered “explicitly positive content.”

“Autistic people should be sterilized” seems offensive to [one moderator], but it stays up as well. Autism is not a “protected characteristic” the way race and gender are, and so it doesn’t violate the policy. (“Men should be sterilized” would be taken down.)

In January, Facebook distributes a policy update stating that moderators should take into account recent romantic upheaval when evaluating posts that express hatred toward a gender. “I hate all men” has always violated the policy. But “I just broke up with my boyfriend, and I hate all men” no longer does. –The Verge

Moderators also complain about an “ever-changing rulebook,” and compare their job to “a high-stakes video game in which you start out with 100 points — a perfect accuracy score — and then scratch and claw to keep as many of those points as you can. Because once you fall below 95, your job is at risk.”

To help sobbing, PTSD-stricken employees cope with the flood of beheadings, rapes, suicides, bestiality and other shocking content, Cognizant offers its employees nine minutes per day of “wellness time,” to be used if they feel traumatized and need to step away from their desks. They are told to cope with job stress by visiting counselors, calling hotlines, and using employee assistance therapy sessions.

“[S]ix employees I spoke with told me they found these resources inadequate. They told me they coped with the stress of the job in other ways: with sex, drugs, and offensive jokes,” writes Newton. 

Among the places that Cognizant employees have been found having sex at work: the bathroom stalls, the stairwells, the parking garage, and the room reserved for lactating mothers. In early 2018, the security team sent out a memo to managers alerting them to the behavior, a person familiar with the matter told me. The solution: management removed door locks from the mother’s room and from a handful of other private rooms. (The mother’s room now locks again, but would-be users must first check out a key from an administrator.)

A former moderator named Sara said that the secrecy around their work, coupled with the difficulty of the job, forged strong bonds between employees. “You get really close to your coworkers really quickly,” she says. “If you’re not allowed to talk to your friends or family about your job, that’s going to create some distance. You might feel closer to these people. It feels like an emotional connection, when in reality you’re just trauma bonding.” –The Verge

Last September, a Northern California content moderator sued the social media giant after she said she was “exposed to highly toxic, unsafe, and injurious content during her employment as a content moderator at Facebook,” which she says gave her post-traumatic stress disorder (PTSD).

Selena Scola moderated content for Facebook as an employee of contractor Pro Unlimited, Inc. between June 2017 and March 2018, according to her complaint.

“Every day, Facebook users post millions of videos, images, and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder,” the lawsuit reads. “To maintain a sanitized platform, maximize its already vast profits, and cultivate its public image, Facebook relies on people like Ms. Scola – known as ‘content moderators’ – to view those posts and remove any that violate the corporation’s terms of use.”

The lawsuit also alleges that “Facebook does not provide its content moderators with sufficient training or implement the safety standards it helped develop … Ms. Scola’s PTSD symptoms may be triggered when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises, or is startled. Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator.”

On Monday, likely in response to The Verge‘s article, Facebook announced new steps it is taking to support content moderators.

via ZeroHedge News https://ift.tt/2T94S9R Tyler Durden
