Copyright ©2010 NPR. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.

ROBERT SIEGEL, host:

From NPR News, this is ALL THINGS CONSIDERED. I'm Robert Siegel.

MELISSA BLOCK, host:

And I'm Melissa Block.

The Justice Department says there's been an explosion in the trafficking of child pornography. While the government has pledged to step up its investigation of people who download the illegal images, it has hardly been neglecting the problem. Since 2000, the number of federal prosecutions has almost tripled.

As NPR's Martin Kaste reports, that has some asking if more could be done to prevent people from downloading the images to begin with.

MARTIN KASTE: In September of 2004, a family in Northern California got a visit from the FBI.

DAVE: I get a call from my wife. She's crying and upset. And so I went home and there were several agents going through my house, who stated that there was, you know, child porn on one of the computers.

KASTE: This is Dave. We're leaving out his last name because he has the kind of job that could be jeopardized by any mention of child pornography. In this case, the images were on a computer used by his 20-year-old son. His son is now in prison serving a three-year sentence for one count of possessing child pornography. Dave still can't get over the damage that was done by some computer files.

DAVE: We have a young kid that has no criminal history - has no - is not a pedophile, is not a child molester, and he's going to be labeled as a sex offender and have to register as such every year on his birthday for the rest of his life.

KASTE: Of course, these aren't just any computer files. Jeff Fischbach is a forensic analyst who works with this kind of material. And he's seen what happens when these images are shown to juries.

Mr. JEFF FISCHBACH (Forensic Analyst): You can watch the expressions change. I saw one gentleman in the front row of the jury, I think he was number three, and I saw him lurch forward toward the defendant.

KASTE: Defendants almost never risk going to trial. Lawyers usually recommend they take a plea bargain. It's Fischbach's job to analyze the hard drives in these cases. And when he does, he follows strict protocols, as if the drives contain plutonium.

Mr. FISCHBACH: It is handled like it's radioactive.

KASTE: He keeps the drives in special locked boxes, even when they're running, and he never connects them to the Internet or to other computers, for fear of cross-contamination. His job has taught him the legal jeopardy that can be caused by even one stray thumbnail image.

And yet, he says, the very same files can be found on the open Internet quite easily.

Mr. FISCHBACH: Oh, right now anybody is just one search term and a click on Google away from most of the same files that I have seen as part of my work.

KASTE: Fischbach has come to believe that these easy-to-find images are a kind of public hazard. He thinks about one case he worked on for a defendant who went to prison because of just one night of ill-advised Web surfing. The easy-to-find images are also tempting weapons in messy custody battles and divorces. He's convinced that in some of the cases he's worked on, one spouse was framed by another.

All of this makes Fischbach wonder why more hasn't been done to block some of the more obvious sources of these radioactive files.

Mr. FISCHBACH: It's the same thing as any other public nuisance. Part of the government's job is not just to go out there and stop people from doing bad things, but to stop good people from having to fall victim to that.

KASTE: It's probably not constitutional for the government to block offending websites outright, but Fischbach says Internet service providers and search engines could voluntarily filter the images that reach their customers, just as email providers filter out known viruses. He's been suggesting this idea for years, and now somebody is trying it.

Mr. JOHN SCARROW (General Manager, Windows Live Safety Services): I have a photo. I'm going to open it so you can see. It's a picture of a truck with a duck.

KASTE: John Scarrow is general manager for Windows Live Safety Services. In a conference room on the Microsoft campus in Redmond, Washington, Scarrow demonstrates something they call PhotoDNA.

Mr. SCARROW: Now, clearly this is not a piece of child pornography. But we've tagged this and created an image as though it was.

KASTE: Scarrow then tries to upload the picture to Microsoft's cloud computing service called SkyDrive. The numeric value of that photo - something called a hash value - is checked against a master list of known child pornography photos that's maintained by the National Center for Missing and Exploited Children. If the photo matches one of those...
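[Editor's note: The matching step Kaste describes - computing a numeric value for an uploaded photo and checking it against a master list of known images - can be sketched in a few lines. This illustration uses an ordinary cryptographic hash (SHA-256) purely for simplicity; PhotoDNA itself uses a more flexible signature, as the story explains below. The blocklist contents and function names here are hypothetical.]

```python
import hashlib

# Hypothetical blocklist of digests for known illegal images (illustrative
# values only). In the system described, the master list is maintained by
# the National Center for Missing and Exploited Children.
BLOCKLIST = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if this upload's digest appears on the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKLIST

print(is_flagged(b"known-bad-image-bytes"))       # True: exact match
print(is_flagged(b"harmless-duck-truck-photo"))   # False: not on the list
```

In a real service, a match would trigger the account action Scarrow demonstrates - disabling the account - rather than simply returning a boolean.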

Mr. SCARROW: It disabled the account. This error you're seeing is a placeholder error for the fact that this account has been disabled.

KASTE: Microsoft is using a more flexible version of traditional hash values, and that means it can ID an image even if the picture has been cropped or resized. This flexibility takes some heavy math, and that means the system can't be used to filter the whole Internet; that would simply take too much computing power. But Scarrow says filtering everything was never the point.
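[Editor's note: PhotoDNA's actual algorithm is proprietary, but the general idea of a "flexible" hash - one that survives small edits a cryptographic hash would not - can be illustrated with a simple average hash: one bit per pixel, set when that pixel is brighter than the image's mean, with matches judged by how many bits differ. The tiny pixel grids below are made-up values standing in for grayscale images.]

```python
def average_hash(pixels):
    """A simple perceptual 'average hash': one bit per pixel, set when
    the pixel is brighter than the image's mean intensity."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two hashes; small means similar."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x4 "grayscale image" and a slightly brightened copy, standing
# in for a re-encoded or lightly edited photo.
original = [[10, 200, 30, 220], [15, 210, 25, 230],
            [12, 205, 35, 225], [18, 215, 28, 235]]
altered  = [[12, 202, 32, 222], [17, 212, 27, 232],
            [14, 207, 37, 227], [20, 217, 30, 237]]

h1, h2 = average_hash(original), average_hash(altered)
print(hamming(h1, h2))  # 0: the edited copy still matches
```

A cryptographic hash of those two grids would differ completely; the perceptual hash does not, which is what lets a system of this kind recognize a cropped or resized copy of a known image. The trade-off, as the story notes, is that this robustness costs far more computation per image than an exact hash lookup.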

Mr. SCARROW: Somebody that's smart enough, that really wants to traffic in the material will figure out how to circumvent these. But the, if you will, the person that is getting a hold of it because it's really easy to get a hold of, you know, is just looking around, whatever, this will have a - we believe, this will have a pretty substantial impact on them.

KASTE: The filtering system is already running on the Bing search engine. If your search finds a site with known child pornography, those links are suppressed.

Microsoft is planning to expand the filter to its other online services, and it's offering to share PhotoDNA free of charge with other online portals. And the project has the support of the Justice Department - tentative support.

Mr. DREW OOSTERBAAN (Justice Department Criminal Division - Child Exploitation and Obscenity Section): I think it's fair to say that it may be part of an answer.

KASTE: Drew Oosterbaan runs the Criminal Division's Child Exploitation and Obscenity Section. He says file filtering could become, in his words, another front in the battle, but he says there could also be unforeseen consequences. One possibility, he says, is that the filtering could work too well. If child pornography became too hard to find, he says, it might create demand for new images and cause more children to be abused.

Mr. OOSTERBAAN: That's why this is a very, very challenging area of law enforcement, where, you know, in the past 10 years, I've never found any real simple solutions.

KASTE: For now, filtering images with PhotoDNA is a pilot project - and it's purely voluntary. One thing its developers insist on is that they don't want to see it evolve into a filtering system that's mandated by the government.

Martin Kaste, NPR News.


NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
