Critics Say YouTube Hasn't Done Enough To Crack Down On Extremist Content
A year ago, YouTube faced heavy criticism for not taking down extremist content. Since then, the company has revamped its algorithm and hired content moderators. But is the new formula working?

MARY LOUISE KELLY, HOST:

And time now for All Tech Considered.

(SOUNDBITE OF ULRICH SCHNAUSS' "NOTHING HAPPENS IN JUNE")

KELLY: All this month, we've been reporting on toxic content - what it is, what's to be done about it, both questions that YouTube has thought long and hard about.

AUDIE CORNISH, HOST:

For years, the YouTube videos of radical Muslim cleric Anwar al-Awlaki inspired terrorists like the Fort Hood gunman and the Boston Marathon bombers. Last year, YouTube pulled down his propaganda videos. It's been trying to reassure people that it's addressing the problem of extremism on its platform.

KELLY: But critics say YouTube, which is owned by Google, has not done nearly enough to prevent extremist videos, such as jihadist or white nationalist propaganda, from being hosted on the platform. NPR's Tim Mak has more.

TIM MAK, BYLINE: Viewers worldwide watch more than a billion hours of YouTube a day. Amid that flood of video, the platform has struggled to keep extremism out. What's more, it has also struggled with its own recommendation systems, which suggest to people what they might like to watch.

REBECCA LEWIS: YouTube's algorithms themselves can sometimes lead people down these rabbit holes.

MAK: That's Rebecca Lewis, a researcher at the Data & Society Research Institute.

LEWIS: YouTube as a platform has an incentive to keep viewers on its website. So if you watch a video that is criticizing feminism, then the next video that YouTube may suggest will be one step a little bit more intense than that. So maybe it'll be a little bit more overtly sexist or a little bit more overtly racist.

MAK: YouTube has resisted public scrutiny of these problems. The company declined a request from NPR to be interviewed on tape. Google also declined to send a high-level executive to testify before a Senate Intelligence Committee hearing earlier this year, drawing the ire of the panel's members. Facebook and Twitter sent their COO and CEO, respectively.

MARK WARNER: My frustration level with Google grows virtually every week.

MAK: That's Senator Mark Warner, the top Democrat on the Senate Intelligence Committee.

WARNER: Increasingly, researchers are saying that some of the most disruptive behavior in terms of radicalizing and incenting violence and incenting hate behavior on both the left and the right is actually used with the YouTube platform.

MAK: Marc Ginsberg, an adviser to the Counter Extremism Project, a group dedicated to confronting extremist messaging online, was highly critical of YouTube for not doing enough to remove extremist videos.

MARC GINSBERG: YouTube's management is insensitive, full of hubris, unwilling to be held responsible to the public for its failures to adopt necessary policies and procedures to remove extremist content.

MAK: In a statement, YouTube provided a list of steps that it has taken to address the issue and said it takes, quote, "swift action against terrorism content and content that incites violence."

The company said it has expanded its use of automated machine learning techniques to remove violent extremist content. It has also hired 10,000 people to review content that violates its terms.

GINSBERG: We've watched people who've been hired or flagged part-time as consultants, and their ability to stay online and to watch this content burns them out.

MAK: That's Ginsberg again.

Why are people being so burnt out?

GINSBERG: Because they've seen content that is horrific.

MAK: YouTube also says that it has implemented what it calls the redirect method on its platform so that when people search for particularly sensitive keywords, they will be redirected to content that debunks violent extremist messages.

RENEE DIRESTA: They've begun to use Wikipedia, like linking out to anti-conspiracy theory content. That's a very naive approach.

MAK: That's Renee DiResta, a researcher for New Knowledge who investigates the spread of narratives across social networks.

DIRESTA: The best defense is actually preventing people from going down these radicalization pathways in the first place. But that requires the platforms to take a paternalistic approach to what they show. That makes a lot of people uncomfortable.

MAK: The question she asks is, are we comfortable with a more aggressive approach where Google becomes the content police? Tim Mak, NPR News, Washington.

(SOUNDBITE OF MALA RODRIGUEZ'S "LA NINA")

Copyright © 2018 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.