YouTube Removes White Supremacist Content

YouTube's new policy intended to remove hateful speech from its platform has been effective, but it is also having unintended consequences.

LULU GARCIA-NAVARRO, HOST:

YouTube is removing thousands of videos with white supremacist and other extremist content. It's the latest big tech company to tighten restrictions in response to public pressure to clean up the hate speech that permeates social media. NPR national security correspondent Hannah Allam is here with us to talk about the purge. Hey.

HANNAH ALLAM, BYLINE: Hi.

GARCIA-NAVARRO: So it has been a couple of days since YouTube began taking down videos. What have you seen? What kind of effect has it had?

ALLAM: Well, there was no waiting around. This policy kicked in immediately. YouTube videos with extremist content started vanishing - videos that promoted white supremacy, neo-Nazi videos. Some civil rights groups and people who've been targeted for harassment online say it's a step in the right direction, although they also have concerns that it doesn't go far enough or it's impossible to enforce. And on the flipside, there are people who say it goes too far.

GARCIA-NAVARRO: Well, tell me. Give me an example of that. Who's saying it's going too far?

ALLAM: One big example is Steven Crowder. He's a right-wing commentator. His YouTube channel has 3 million or more subscribers. He has a history of offensive language, including repeatedly insulting a Vox journalist's race and sexual orientation. After reviews, YouTube ultimately decided not to take down his videos. They said it wasn't a violation. Got some pushback, but then they decided to demonetize them, meaning he can't profit from them. Of course, he wasn't happy with this, and neither were his fans. And his fans include Senator Ted Cruz, the Texas Republican who, you know, went on Twitter demanding that YouTube, quote, "stop playing God and silencing those voices you disagree with."

GARCIA-NAVARRO: In YouTube's effort to get rid of hate speech and bigotry, it also apparently swept up some unintended victims?

ALLAM: That's right. That was a concern going into this, and it has played out. We've already seen several examples of historical and educational material being removed, things like an educational video used to teach about Hitler and Nazi Germany. Even the Southern Poverty Law Center, which is one of the nation's best-known trackers of extremism - they had one of their videos removed because it included an interview with a British Holocaust denier.

GARCIA-NAVARRO: So I guess that brings us to the question of who or what is flagging these videos?

ALLAM: Yes. And the question of enforcement and, you know, weeding out these videos is a difficult one. YouTube uses a variety of methods. They have automated systems, human monitors. YouTube users themselves can report and flag violations. But like other platforms, YouTube's in a bind. On one hand, it wants to be an open forum for a broad spectrum of ideas. On the other, it doesn't want to be accused of helping to spread extremism and hate that we've seen lead to violence in some cases.

GARCIA-NAVARRO: And, of course, it wants to make money.

ALLAM: Definitely, it wants to make money. Advertisers love YouTube because it's unmatched in its reach. And so, yes, there's definitely the financial angle, as well.

GARCIA-NAVARRO: Some free speech advocates say this restricting of content smacks of censorship.

ALLAM: That's right. It's typically framed as a kind of a free speech issue. But there's also an argument that it's a public safety issue, and that we're in a new era with new and evolving technology that's made it incredibly simple and instantaneous to spread hate and extremism. And when those ideas turn to violence, as we've seen happen in places like Christchurch, New Zealand, it becomes as much about public safety as it is about free speech. And so for now, this debate is in the private sector. But the concern is if these companies fail to do something about it, does the government then come in and start regulating?

And, you know, that raises a bunch of thorny First Amendment questions. And so that's the tension we're seeing, who gets to decide the new rules for a new era.

GARCIA-NAVARRO: NPR's national security correspondent Hannah Allam.

Thank you so much.

ALLAM: Thank you.

Copyright © 2019 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.