Facebook Removes Iconic 'Napalm Girl' Photo From Its Site : All Tech Considered A little-known team of humans at Facebook removed the iconic photo from the site this week. That move shows how much the company is struggling internally to exercise the most basic editorial judgment.

With 'Napalm Girl,' Facebook Humans (Not Algorithms) Struggle To Be Editor

Facebook is being accused of censorship and abuse of power after it banned an iconic photo from the Vietnam War. It's the photo of a naked 9-year-old girl fleeing from a napalm attack. Facebook says it took the image down because it violated the site's standards on nudity. The company is now reposting the photo.

I spoke with NPR's Aarti Shahani about this earlier. She first explained how Facebook got into this mess over what's clearly a historical image.

AARTI SHAHANI, BYLINE: Here is a really important detail. OK, while it is true that Facebook uses automated systems to identify prohibited content - right? - they've got software crawling through the site and looking for nudity and porn videos. In this case, software was not at work. The company says a human being flagged the photo. That means a real-life Facebook user saw it. Maybe it was in their feed. You know, they're friends of the writer or somebody who shared it.

The point is a person saw it and reported it, and inside Facebook, a person, not an algorithm, decided to take it down. Now, this is an extraordinary decision. A photo deemed so significant that it won a Pulitzer Prize did not make the cut under Facebook's community standards, the rules the company sets for what's OK.

CORNISH: Has Facebook said who that person was inside the company who made the decision on the photograph?

SHAHANI: No, we have no clue about that. We don't know the name of the person or even what country they're in. And Facebook has a small army of quote, unquote, "safety specialists" - people responsible for removing content. They're scattered all over the world in the U.S., India, Ireland.

The initial decision could have come from a recent college grad with zero background in journalism who saw a stark photo and felt, oh, I've yanked off things like this before and, you know, hit delete.

CORNISH: So where does Facebook go from here? Will the company essentially rethink this kind of black-and-white policy on nudity?

SHAHANI: Well, it's funny you say black and white because they would say - in fact they have told me in past interviews that they are nuanced. They always consider context unless something is clearly illegal. And by the way, I did run this photo and this issue by a lawyer in the European Union. The user who initially posted it was in Norway. And the lawyer says it's unimaginable, legally speaking, that Facebook would have thought this is child porn. So the fact that it was a discretionary decision - that leaves us with two big so-whats.

OK, one is, how much does Facebook exercise editorial judgment anyway - a lot, a little, is it arbitrary? The company says it's just a platform, but it's acting like a publisher when it decides to edit and remove things. And number two is, how much do they explain themselves to their stakeholders? Do they adopt the best practices that traditional publishers have had when they choose to act and explain to the public, or is it OK to be a black box of sorts?

CORNISH: That's NPR's Aarti Shahani. Aarti, thanks so much.

SHAHANI: Thank you.

Copyright © 2016 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.