Facebook Content Moderators Trained With Video of Man Getting Stabbed to Death

Facebook content moderators sift daily through violent, hateful and graphically pornographic content on the social media platform, making them essential to the Silicon Valley giant's operations, especially in this time of intensive scrutiny of its content. But many are employed through third-party professional service vendors and work under strikingly different conditions than Facebook's direct employees do.

On Monday, The Verge exposed the traumatizing work performed by thousands of underpaid contractors who moderate Facebook content, including roughly 1,000 moderators at a Cognizant site in Phoenix, a Facebook contractor. The story opens with moderators in training watching a shocking video of a man being stabbed "dozens of times, while he screams and begs for his life."

The Trauma Floor: The secret lives of Facebook moderators in America https://t.co/d1JzzBN5YK by @caseynewton pic.twitter.com/vaFSMkI3oc

— The Verge (@verge) February 25, 2019

It's the job of a trainee, whose name, like those of most others in the story, has been changed to preserve her anonymity, to correctly identify the on-video murder as a violation of Section 13 of Facebook's community standards. That section bans videos of "dismemberment unless in a medical setting" and "cannibalism," but does allow certain "videos that show the violent death of a person or people by accident or murder" to appear behind a warning screen.

While the moderators, called "process executives" by Cognizant, are essential to the operation of Facebook, they don't benefit from the largesse that typically goes with working for a Silicon Valley giant. Much as Amazon's corporate employees are insulated from the reportedly brutal conditions endured by contractors in the company's warehouses, Facebook moderators don't receive such perks as in-house bike repair, fancy cafeterias, valet parking and generous parental leave, although they do receive health care benefits. While the average Facebook employee makes $240,000 annually in total compensation, the process executives working for Cognizant make $28,800 a year.

Widespread use of contract labor helps keep Facebook's profit margin high: $6.9 billion in profit on $16.9 billion in revenue in the fourth quarter of 2018. Facebook hasn't disclosed exact figures, but contract workers outnumber direct employees at other major tech companies, including Google.

In addition to the graphic and traumatizing content contractors are subjected to on the job, their bathroom breaks and accuracy in moderating are micromanaged as they grind through up to 400 flagged posts a day. Employees in Cognizant's Phoenix office are allowed two 15-minute breaks and a 30-minute lunch break per day, plus nine minutes of "wellness time" (to step away from their desks when they feel traumatized), but spend much of their break time waiting in line for the limited number of bathrooms, according to The Verge report.

"This was an issue that comes with a growing company and facility constraints," a Facebook spokesperson told Newsweek regarding the Phoenix facility bathroom shortage. "Once a new space was outfitted, a new set of bathrooms were opened. In addition, this is a place of business. If someone needs to go to the bathroom they can and are given time to go to the bathroom."

Cognizant employees are offered on-site counseling, but only for part of the day, according to The Verge. There are hotline therapy sessions and therapeutic activities such as yoga, but employees who spoke with The Verge said they turned to additional outlets for stress relief, including drugs, on-site sex and pervasive gallows humor. Several moderators described symptoms similar to secondary traumatic stress and PTSD, with one employee describing a panic attack at a movie theater (the movie was Mother!) months after leaving her content moderator position.

"I'm fucked up, man," one employee told The Verge. "My mental health—it's just so up and down. One day I can be really happy, and doing really good. The next day, I'm more or less of a zombie. It's not that I'm depressed. I'm just stuck."

One counselor at the Phoenix site, speaking anonymously, suggested that the trauma suffered by moderators may turn into beneficial "post-traumatic growth," as manifested in the strength exhibited by Pakistani education activist Malala Yousafzai after she was shot in the head by a Taliban gunman. "There are many examples of people that experience difficult times and come back stronger than before," the counselor told The Verge.

Beyond the flood of traumatizing pornography, violence and hate, many moderators also found themselves subjected to the same propagandizing influence that has made social media such a hotbed for viral conspiracy theories. Repeated exposure has led some moderators to endorse Flat Earth, 9/11 Truth and various "false flag" theories surrounding mass shootings.

"People really started to believe these posts they were supposed to be moderating," one moderator told The Verge. "They were saying, 'Oh gosh, they weren't really there. Look at this CNN video of David Hogg—he's too old to be in school.' People started Googling things instead of doing their jobs and looking into conspiracy theories about them. We were like, 'Guys, no, this is the crazy stuff we're supposed to be moderating. What are you doing?'"

"The work is not easy. It sometimes means looking at disturbing or violent content," Facebook Vice President of Operations Ellen Silver wrote in a 2018 blog post, which described how the company supported its moderators. "At Facebook we have a team of four clinical psychologists across three regions who are tasked with designing, delivering and evaluating resiliency programs for everyone who works with graphic and objectionable content."

These four psychologists are tasked with helping roughly 15,000 content moderators working for Facebook contractors in 50 languages and at 20 locations around the world, including in California, Arizona, Texas and Florida.

A spokesperson for Facebook emphasized that the majority of content flagged for moderation is "benign in nature."

This article has been updated to include comments from Facebook.

