Citizen Evidence Lab Separates Truth From Fiction In Viral Videos

When a grainy video of human rights abuse goes viral, how do you know it's real? NPR's Scott Simon speaks with Christoph Koettl, of Citizen Evidence Lab, which helps users verify videos and photos.


SCOTT SIMON, HOST:

With each new conflict, there seem to be images of alleged human rights abuses posted on Facebook, Twitter, blogs and websites, where they can ricochet around the world. But how does a viewer know if they're real? This month, Amnesty International launched the Citizen Evidence Lab, a site to help users try to determine the veracity of those images. We're joined at our studios by the lab's founder, Christoph Koettl. Mr. Koettl, thanks so much for being with us.

CHRISTOPH KOETTL: Thanks for having me.

SIMON: How does this work? Someone sees a video, and what do they do?

KOETTL: The website provides very, very clear guidelines on how to assess a video. There are a few very simple steps that every one of us can do in a few minutes to check: is this new footage? Is this old footage? Has this footage been used before? Does it come, maybe, from a completely different conflict altogether?

SIMON: Yeah. So even if the image is real, it could be something that happened five years ago or in a country 2000 miles away.

KOETTL: Exactly. And what you just described is really the core problem. We don't see a lot of technically manipulated footage. But what we see a lot is footage that is shared from a different conflict, or that was already used a year earlier - so it's something that happened a long time before.

SIMON: What are some of the telltale signs people should look for? Or some of the first questions they should ask might be a better way to put it.

KOETTL: So I think the first question people should ask is: is this actually new footage? And there are some really easy steps you can take to do that sort of work. You can do what we call a reverse image search, to see: has this image been used before? Has this video already appeared somewhere on the Internet? So that's one of the first things you do. But then it really comes down to three specific things. What is the source? Can you confirm the date? Can you confirm the location? And there are a lot of red flags that we look for. For example, somebody uploads a new YouTube video on a completely new YouTube account that was created the day before. That might be a red flag that the account was just created to promote that specific video, which might be misinformation.
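The reverse image search Koettl describes can be approximated programmatically with perceptual hashing: near-identical frames produce near-identical hashes even after recompression or resizing. The sketch below is only an illustration of that technique, not Amnesty International's own tooling; it assumes the open-source Pillow and imagehash Python libraries, and the file names are hypothetical placeholders.

```python
# Minimal sketch: flag a viral frame as a likely re-use of archived footage
# via perceptual hashing (pip install pillow imagehash).
from PIL import Image
import imagehash

def likely_reused(path_a: str, path_b: str, threshold: int = 8) -> bool:
    """Return True if the two images are perceptually near-identical."""
    hash_a = imagehash.phash(Image.open(path_a))
    hash_b = imagehash.phash(Image.open(path_b))
    # Subtraction gives the Hamming distance between 64-bit hashes;
    # a small distance means the images are near-duplicates.
    return (hash_a - hash_b) <= threshold

if __name__ == "__main__":
    # Hypothetical file names for illustration only.
    print(likely_reused("viral_frame.jpg", "archive_frame.jpg"))
```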

SIMON: There's a famous picture of white phosphorus coming down in Gaza.

KOETTL: Yes. It's a really, really dramatic picture. That is a picture from January 2009. And somebody posted it in the context of the current conflict and claimed that this is what happened today. And in a very tense situation, as we see between Hamas and Israel, posting this sort of material really can lead to further escalation. So that's why it's really important to be able to say very quickly: this is old footage. That doesn't, of course, diminish the actual human rights violation, but it's important to say right away, this is not what happened today.

SIMON: What if somebody knows which of your algorithms they have to satisfy and decides to concoct something theatrically, but makes certain it has the right timestamp?

KOETTL: Yes. So that's a great question, I think. That's why I also wanted to build a resource that is online, so we can adapt it, because if you look at the other side, people adapt their tactics as well. I can give a very concrete example. Somebody posted a picture of the conflict in Ukraine - a picture of a destroyed tank. But actually, it was a picture of a destroyed tank from the Georgia-Russia conflict in 2008. And they flipped the picture, so it was much more difficult to detect. That's something we have to integrate into our work. That's why it's important to have an online website where you can expose these sorts of tactics.
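The flipped-tank example shows how mirroring an image can defeat a naive hash comparison. A minimal countermeasure, sketched below under the same assumptions (Pillow and imagehash libraries, hypothetical file names), is to hash the query image both as-is and mirrored before comparing it against the reference.

```python
# Minimal sketch: a reverse-image check that also catches horizontally
# flipped copies (pip install pillow imagehash).
from PIL import Image, ImageOps
import imagehash

def matches_even_if_flipped(query_path: str, reference_path: str,
                            threshold: int = 8) -> bool:
    """Return True if the query matches the reference, flipped or not."""
    query = Image.open(query_path)
    ref_hash = imagehash.phash(Image.open(reference_path))
    # Compare the query against the reference both as-is and mirrored.
    for candidate in (query, ImageOps.mirror(query)):
        if (imagehash.phash(candidate) - ref_hash) <= threshold:
            return True
    return False

if __name__ == "__main__":
    # Hypothetical file names for illustration only.
    print(matches_even_if_flipped("ukraine_post.jpg", "georgia_2008_tank.jpg"))
```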

SIMON: What do you hope the Citizen Evidence Lab can accomplish, ultimately?

KOETTL: I think that's really the most important question that we have to discuss here. We want to document human rights violations. And we have to have credible evidence. And the ultimate goal is to bring perpetrators to justice. So this sort of video footage and pictures give us a completely new level of detail. So you see specific military units. You see specific, individual perpetrators. That will be crucial to ensure individual criminal justice, and that's the ultimate goal of our work.

SIMON: Christoph Koettl is founder and editor of Amnesty International's Citizen Evidence Lab. Thanks very much.

KOETTL: Thanks for having me.

Copyright © 2014 NPR. All rights reserved. No quotes from the materials contained herein may be used in any media without attribution to NPR. This transcript is provided for personal, noncommercial use only, pursuant to our Terms of Use. Any other use requires NPR's prior permission. Visit our permissions page for further information.

NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
