
ARUN RATH, HOST:

It's ALL THINGS CONSIDERED from NPR News. I'm Arun Rath.

On Thursday, authorities in Canada announced they had busted an enormous international child pornography operation. It was the end of a three-year investigation into a website that trafficked in illicit videos of young boys. Three hundred and forty-eight people have been arrested in connection with the videos, 76 of them in the U.S.

Investigations like this end with press conferences and high-profile trials, but they begin far away from the public eye with one of the most difficult jobs in the world.

NPR's Rebecca Hersher reports. But first, a warning. This story contains content that some listeners may find disturbing.

REBECCA HERSHER, BYLINE: Rich Brown knows how to crack a child pornography case. For decades, he was a cop for the New Jersey State Police. He worked on the Internet Crimes Against Children Task Force. And part of his job was to look through suspects' hard drives, image by image, which meant he saw some terrible, terrible things.

RICHARD BROWN: You know, there's - I think there's certain things that are more difficult. I do remember seeing the first molestation video. It was a husband that was molesting his two daughters.

HERSHER: When he went home at night, it was difficult for Brown to forget what he had seen.

BROWN: I have two boys, and I remember being ultra-protective of my boys during the time that I was involved with this type of work, and I think that's pretty common.

HERSHER: Brown says this kind of work can be traumatic. Some officers had trouble sleeping; others had marriage problems. But cops are no longer the only people getting paid to review images of child pornography. The rise of Internet porn has created a shadow industry of people working for tech companies like Google, Microsoft and Facebook. They spend eight hours a day, 40 hours a week looking at pictures and videos and asking themselves, is this child pornography?

SARAH ROBERTS: This is a shadow industry by design. So you're not going to find an association of commercial content moderators publishing statistics and necessarily making themselves known in that way. There's value to the invisibility.

HERSHER: That's Sarah Roberts of Western University. She's one of the few people who systematically study those who do this work. She says even though the Internet couldn't function without content moderators, very few people have heard of them, even within the tech industry.

ROBERTS: A lot of them are under nondisclosure agreements, so they're precluded from speaking to the media, and it is difficult to reach out and find them. And I think there's an aspect of trauma that can often go along with this work, and many workers would rather go home and tune out and not talk about it. So I think the unknown aspect of this work is certainly by design. It's no mistake that it's difficult to find workers who will talk to you about this.

HERSHER: Microsoft and Google both declined to put me in touch with any of the people who review images for their services. Samantha Doerr is the director of child protection at Microsoft.

SAMANTHA DOERR: It's a yucky job. In fact, Microsoft has to invest in, you know, wellness programs for people that work on this.

HERSHER: In March, Microsoft and eight other tech companies came out with guidelines for such wellness programs. They suggest employees take their minds off traumatic content by, for example, taking a 15-minute walk or engaging in a hobby. Many companies have a counseling hotline for employees. But Sarah Roberts says that may not be enough.

ROBERTS: We're also talking about a culture in which your success at your job is directly tied to your ability to stomach this imagery. So if someone were to access these kinds of support services, there may be an implicit suggestion that they're not cut out for the work that they're trying to do for a living.

HERSHER: Doerr says Internet service providers are investing in ways to decrease the number of people who need to do this kind of work, and with good reason.

DOERR: Unlike any other kind of, you know, offensive or illegal content online, the image itself is a crime scene, and every new viewing of that image is a re-victimization of that child.

HERSHER: Scott Rubin is a spokesman for Google.

SCOTT RUBIN: When we become aware of one of those images, we will create this digital signature. It's like a fingerprint. It's a series of ones and zeroes unique to that particular image.

HERSHER: In an effort to automate the process, Microsoft and Google have each built software that creates and matches these signatures. Tagged pictures are sent to a database at the National Center for Missing and Exploited Children, which coordinates with law enforcement. And earlier this year, Twitter also began using the Microsoft program to screen every photo posted to that site.
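In code terms, the matching step Rubin describes might look like the minimal Python sketch below. This is an illustration only: the SHA-256 hash and the KNOWN_SIGNATURES set are assumptions made for this example, not the actual systems; the production tools named in the story, such as Microsoft's PhotoDNA, use proprietary robust hashes designed to still match an image after it has been resized or re-encoded, which an exact-match cryptographic hash cannot do.

```python
import hashlib
from pathlib import Path

# Illustrative stand-in for the shared database of known signatures
# maintained at the National Center for Missing and Exploited Children.
# In practice these values would come from that database; the set is
# left empty here because the real entries are obviously not public.
KNOWN_SIGNATURES: set[str] = set()

def digital_signature(image_path: Path) -> str:
    """Return a hex 'fingerprint' for an image file.

    A plain SHA-256 over the file's bytes is used only to illustrate
    the idea of a unique signature; the real systems described in the
    story use robust perceptual hashes instead.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def should_flag(image_path: Path) -> bool:
    """True if the image's signature matches a known illegal image."""
    return digital_signature(image_path) in KNOWN_SIGNATURES
```

The design point the story alludes to is that matching signatures, rather than re-reviewing pictures, lets a previously identified image be caught automatically, without another human moderator ever having to view it.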

But despite the software advances, the need for content moderators is still growing. And Sarah Roberts says that workers she's spoken to often feel stigmatized.

ROBERTS: They've said things to me like, you don't want an Internet without our interventions. But at the same time it's exacting a toll on these workers. And because this industry is so new and the need for this work is so new, I think the jury is out as to what the real implications are going to be for these people later on in their life.

HERSHER: And as demand grows, companies are looking to cheap foreign workers to do some of the most difficult work. Rebecca Hersher, NPR News.

Copyright © 2013 NPR. All rights reserved. No quotes from the materials contained herein may be used in any media without attribution to NPR. This transcript is provided for personal, noncommercial use only, pursuant to our Terms of Use. Any other use requires NPR's prior permission. Visit our permissions page for further information.

NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
