Does Instagram Have A Problem With Hate Speech And Extremism? While Facebook and Twitter have come under criticism for the spread of misinformation and conspiracy theories, Instagram has flown relatively under the radar, says Taylor Lorenz of The Atlantic.

Instagram Has A Problem With Hate Speech And Extremism, 'Atlantic' Reporter Says



COLEMAN, HOST: We've had many conversations on this program about how misinformation and conspiracy theories spread on social media platforms like Facebook and Twitter. But one platform used by more than 1 billion people has gone relatively under the radar, and that's Instagram. For those of you who don't know, Instagram is a social media site that is owned by Facebook. On Instagram, users can post pictures, look at pictures from other people's accounts and message other users.

Instagram is increasingly providing a home for hate speech and extremist content. That's according to reporting by Taylor Lorenz, who wrote about this for The Atlantic. She says that Instagram is likely to be, quote, "where the next great battle against misinformation will be fought, and yet it has largely escaped scrutiny." And we should mention here that Facebook is among NPR's financial supporters.

Taylor Lorenz joins us on the line from New York. Welcome, Taylor.

TAYLOR LORENZ: Hi. Thanks for having me.

COLEMAN: Taylor, for people who mainly use, say, Facebook and Twitter, what does Instagram look like? How is it different?

LORENZ: Well, Instagram is primarily a visual platform. So unlike Twitter, where most tweets are made up of text and Facebook, where there's a lot of mix of links, videos, things like that, Instagram is primarily images and video. So you can see content from friends, family, news organizations, meme pages - really anyone.

COLEMAN: So what does extremist content look like on Instagram?

LORENZ: Extremist content on Instagram is essentially just a more visual way of presenting classic misinformation that we've seen on other platforms - so a lot of racist memes, white nationalist content, sometimes screenshots of fake news articles, sometimes people like Alex Jones ranting and promoting conspiracy via YouTube clips that are uploaded to Instagram TV, which is their sort of YouTube competitor. So it can take a lot of different forms.

COLEMAN: Who follows these accounts?

LORENZ: Millions and millions of people follow accounts that post content like this. A lot of these accounts are actually targeted towards younger people. So some of the heaviest engagers on Instagram are teenagers and sort of young millennials. And so a lot of these big right-wing extremist meme pages consider those people their audience. And those are the users that they're targeting.

COLEMAN: How easy is it to, say, follow one account and then get attracted to another account, and then another, and then another that might feature white supremacism?

LORENZ: It's extremely easy. I mean, Instagram actually pushes this and facilitates it. So Instagram is built on a bunch of different algorithms. And one big algorithm that stimulates growth in the site is the page recommendation algorithm. So that's when like - when you follow one Instagram page, you're immediately prompted to follow more pages. So you can follow what's considered a mainstream conservative meme page and you're immediately recommended very extremist content from people like Alex Jones and other notorious conspiracy theorists.

COLEMAN: Now, Facebook has announced that next week it'll begin banning white nationalism and white separatism content on both Facebook and Instagram, which it controls. How did white separatism and white nationalism begin to flourish there in the first place?

LORENZ: All of these platforms have really taken a hands-off approach. They really haven't policed white nationalism or white separatism to the extent that they have other extremist movements. I mean, the New Zealand shooting, I think, was hopefully an inflection point, where it's becoming increasingly clear that they have to crack down on this stuff because not only are they - is this kind of extremist content running rampant on the platforms, these platforms are facilitating its growth.

A big problem with Instagram, though, as opposed to Facebook and Twitter, is that a lot of these big white nationalist figures aren't exactly espousing their ideas on Instagram; they're normalizing themselves. For example, there's a huge cadre of people who are part of the Identity Evropa movement, which is a white nationalist, white supremacist movement.

So a lot of them are adopting influencer strategies, where they're actually just posting about their lifestyle, posting themselves at nice events, dressed up. And people will follow some of these white nationalist figures, aspire to their lifestyle and then end up becoming introduced to their ideas. You know, they'll go ahead and Google them. They'll start watching their YouTube videos. They'll start reading content on a blog, maybe. So they're more susceptible to that.

COLEMAN: That was Taylor Lorenz. She reports on tech news for The Atlantic. Thank you for joining us, Taylor.

LORENZ: Thank you so much for having me.

Copyright © 2019 NPR. All rights reserved. Visit our website terms of use and permissions pages for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.