The Working Lives Of Facebook's Content Moderators NPR's Scott Simon talks to The Verge's Casey Newton, who reported on the mental health costs to social media content moderators in the U.S., who spend hour after hour monitoring graphic content.
Propaganda, Hate Speech, Violence: The Working Lives Of Facebook's Content Moderators


SCOTT SIMON, HOST:

Facebook has pledged to do better at moderating content. The social media company usually employs third-party contractors to do the job. The average moderator makes about $28,000 a year. Meanwhile, the average Facebook employee's salary is around $120,000 a year. And we want to note here that Facebook is a financial supporter of NPR. In a recent article by Casey Newton for The Verge, moderators employed by one of those contractors, Cognizant, talked about the stress of their jobs - not only low pay but high-pressure working conditions and the emotional toll of monitoring hour after hour of graphic content and conspiracy theories. Casey Newton, Silicon Valley editor at The Verge, joins us now from New York. Thanks so much for being with us.

CASEY NEWTON: Thanks for having me, Scott.

SIMON: Well, help us understand how a lot of these employees live during the workday.

NEWTON: Well, every piece of content that gets reported on Facebook needs to be evaluated to see if it breaks the rules or not. And if a moderator makes the wrong call more than a handful of times during the week, their job could be at risk. And so the folks that I spoke with said that they're just under tremendous pressure to try to get it right even though Facebook is changing those guidelines on a near daily basis to account for some nuance. And, of course, a lot of that content they're looking at is extremely graphic or disturbing. And so many of the folks that I spoke with were struggling with mental health issues months after they left the job.

SIMON: Because they have to see so much?

NEWTON: That's right. You know, there are people in the world who spend a lot of time just sort of uploading the worst of humanity onto Facebook. So almost everyone I spoke with could vividly describe for me at least one thing they saw that continues to haunt them.

SIMON: And it sounds as if during their workday, there's not a lot of time to reflect. There's not even really time to go to the bathroom.

NEWTON: That's right. One of the things that surprised me most about this story was that the moderators' time is managed down to the second. Every time they want to use the bathroom, they have to click a browser extension to let someone know that they're leaving. They also get nine minutes a day of something called wellness time, which they're supposed to use if they see something really traumatizing and need to stand up and walk away. But many of the folks that I spoke with said that wasn't really adequate to kind of emotionally process what they were seeing.

SIMON: What about the effect of seeing so many conspiracy theories?

NEWTON: Well - so this was maybe the thing that surprised me the most from my reporting - the majority of the people that I spoke with said that the longer they looked at the kind of fringe conspiracies that get posted onto Facebook, the more they found themselves sympathetic to those ideas. So I spoke to one man who told me that he no longer believes that 9/11 was a terrorist attack. I talked to someone else who said they had begun to question the reality of the Holocaust. And in some cases, these folks knew sort of how wrong that sounded. But they just kept telling me these videos are so persuasive, and we see them all the time.

SIMON: Let me share with you some words we got from Facebook, knowing we were going to interview you: "We work with our partners to ensure they provide competitive compensation starting at $15 per hour, benefits and a high level of support for their employees." They went on to say that they will regularly audit their partners. They'll try to make working conditions and salaries uniform. And they're going to hold a summit on those issues and talk to employees. How do you react to their statement?

NEWTON: Well, I'm glad to hear that Facebook is taking these issues seriously. I would say if they're looking for suggestions, I'm happy to offer two. One would be to pay these folks more. And I think that would be a great place for Facebook to start when it came to compensating employees, who, in many cases, are being asked to evaluate essential questions of speech and security. They're policing the terms of our public debate. That feels like a $60,000-a-year job to me. And then the second thing they could do is just not make these employees have to raise their hand every time they want to go to the bathroom. Just treat these employees the way they treat any Facebook executive, and let them manage their own time.

SIMON: Casey Newton at The Verge, thanks so much for being with us.

NEWTON: Thank you, Scott.

Copyright © 2019 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.