Yasmin Green: How Did The Internet Become A Platform For Hate Groups? Extremist groups have co-opted the Internet's connective power to recruit members. Yasmin Green explores how the Internet has allowed extremism to spread, and how technology can combat it.

GUY RAZ, HOST:

On the show today, ideas about the scope and scale of human innovation and some of its Unintended Consequences. What do you remember about the talk around the Internet in 2006?

YASMIN GREEN: I look back then, and I recall a time when we all used to float around, just thinking that we are doing work that's essentially altruistic. We were just like, we're connecting people to information, to each other. This is going to transform democracies, and it's going to empower populations. And we really didn't think about all of the platforms and apps being developed as tools for everyone who's malicious to just be even more effective.

RAZ: This is Yasmin Green.

GREEN: I'm the director of research and development at Jigsaw.

RAZ: And Jigsaw is a technology company created by Google.

GREEN: So we look at problems like repressive censorship, cyberattacks, online radicalization. And we try to build technology that can help protect people from these threats.

RAZ: Yasmin started out at Jigsaw in 2006. This was a year after YouTube was born and the same year that Twitter was created. But that utopian vision of the Internet that she was just describing - well, it didn't anticipate a platform for trolls and hate groups and extremists to find each other and build a like-minded community. And one prime example - the extremist group ISIS.

GREEN: ISIS has been kind of given the accolade of being, you know, the first terrorist group to really understand the Internet. There was no technological genius in their use of the Internet. There's nothing that ISIS did that was impressive from an innovation perspective. They used the tools that are open to all of us to use for connecting, for sharing, for activism. They used those tools almost in the way that the rest of us use them, but they used them with destructive ends in mind.

RAZ: And so back in 2015, around the time ISIS was recruiting heavily and gaining momentum, Yasmin and her team at Jigsaw wanted to find out how ISIS was so effective. Here's Yasmin Green on the TED stage.

(SOUNDBITE OF TED TALK)

GREEN: So in order to understand the radicalization process, we met with dozens of former members of violent extremist groups. One was a British schoolgirl who'd been taken off of a plane at London Heathrow as she was trying to make her way to Syria to join ISIS. And she was 13 years old. So I sat down with her and her father, and I said, why? And she said, I was looking at pictures of what life is like in Syria, and I thought I was going to go and live in the Islamic Disney World.

That's what she saw in ISIS. She thought she'd meet and marry a jihadi Brad Pitt and go shopping in the mall all day and live happily ever after. ISIS understands what drives people, and they carefully craft a message for each audience. Just look at how many languages they translate their marketing material into. They make pamphlets, radio shows and videos in not just English and Arabic, but German, Russian, French, Turkish, Kurdish, Hebrew, Mandarin Chinese. I've even seen an ISIS-produced video in sign language.

It's actually not tech-savviness that is the reason why ISIS wins hearts and minds. It's their insight into the prejudices, the vulnerabilities, the desires of the people they're trying to reach. That's why it's not enough for the online platforms to focus on removing recruiting material. If we want to have a shot at building meaningful technology that's going to counter radicalization, we have to start with the human journey at its core.

(SOUNDBITE OF MUSIC)

RAZ: Yeah, I mean, given the Internet's virtues, you know, to connect people, would it ever have been possible to prevent bad actors from also taking advantage of those tools?

GREEN: There's a lot of bad stuff that does get stopped. And it's easy and dangerous to say, well, there are good people and bad people because it ends up - the prescription is really punitive technologies or policies, which is, let's suspend people or let's censor people or let's punish people.

And in most of the cases, in my conversations with former, you know, either ISIS recruits or supporters or extremists, what I found is that they were people with almost legitimate questions. And they went down a bad path, but more information, better information, earlier in the process could have steered them in a different direction.

(SOUNDBITE OF TED TALK)

GREEN: Radicalization isn't this yes-or-no choice. It's a process, during which people have questions - about ideology, religion, living conditions - and they're coming online for answers, which is an opportunity to reach them. So in 2016, we partnered with Moonshot CVE to pilot a new approach to countering radicalization called the Redirect Method. It uses the power of online advertising to bridge the gap between those susceptible to ISIS's messaging and those credible voices that are debunking that messaging.

And it works like this - someone looking for extremist materials - say they search for, how do I join ISIS? - will see an ad appear that invites them to watch a YouTube video of a cleric, of a defector - someone who has an authentic answer. And because violent extremism isn't confined to any one language, religion or ideology, the Redirect Method is now being deployed globally to protect people being courted online by violent ideologues, whether they're Islamists, white supremacists or other violent extremists, with the goal of giving them the chance to hear from someone on the other side of that journey.
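The mechanism Green describes - matching risk-indicating search queries to ads that point at counter-messaging videos - can be sketched in a few lines. This is purely an illustrative sketch, not Jigsaw's or Moonshot CVE's actual implementation; the keyword list and the playlist URL are hypothetical placeholders.

```python
# Illustrative sketch of the Redirect Method's core matching idea:
# a search query that matches a risk-indicating keyword triggers an ad
# pointing to counter-narrative content instead of no intervention.
# Keywords and URL below are hypothetical, not real campaign data.

RISK_KEYWORDS = {"join isis", "travel to syria jihad", "khilafah"}

# Placeholder target: in practice this would be a curated playlist of
# clerics, defectors, and other credible voices.
COUNTER_CONTENT_URL = "https://youtube.com/playlist?list=COUNTER_NARRATIVES"


def match_ad(query: str):
    """Return a counter-messaging ad target if the query matches a
    risk-indicating keyword, otherwise None (no ad is shown)."""
    normalized = query.lower()
    if any(keyword in normalized for keyword in RISK_KEYWORDS):
        return COUNTER_CONTENT_URL
    return None


# The example from the talk: a search like "how do I join ISIS"
# would surface the counter-messaging ad.
print(match_ad("how do I join ISIS"))
print(match_ad("weather tomorrow"))
```

The real system layers on ad-auction targeting, multiple languages, and content curated per ideology, but the insertion point is the same: intervene at the moment someone comes online with a question.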

(SOUNDBITE OF MUSIC)

RAZ: I mean, it sounds like you're trying to kind of break this down with the hope, I guess, of at some point figuring out how to solve this. But this is a long-term project. This is not going to happen overnight, right?

GREEN: When our group was started seven years ago, I remember feeling like, wow, this is a real gamble. And now, you know, I think we have to have more people within technology companies that think about the world through this lens. Like, it's not enough just to focus on your platform and the, you know, the micro-instances that you see.

Like, you have to think about terrorist groups and their goal and their strategies and what they're doing across the whole Internet. And you have to have a big-picture view. We can't be so tunnel-visioned anymore. The more that we do that, the better we'll be at spotting problems early.

RAZ: Yeah.

GREEN: When you go around the world and you see how groups who have power are actually using technology to reinforce their power, you realize that the kind of utopian vision of what the Internet was going to be was not inevitable. We'd have to be proactive and step in if we wanted to have a chance of realizing that.

(SOUNDBITE OF MUSIC)

RAZ: That's Yasmin Green. She's the director of research and development at Jigsaw. You can see her full talk at ted.com. On the show today, ideas about Unintended Consequences. I'm Guy Raz, and you're listening to the TED Radio Hour from NPR.

(SOUNDBITE OF MUSIC)

Copyright © 2018 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.