Eli Pariser: How Can We Look Past (or See Beyond) Our Digital Filters? Eli Pariser explains why being trapped in "filter bubbles" is bad for us and bad for democracy. He says we don't get exposed to information that could challenge or broaden our worldview.

How Can We Look Past (or See Beyond) Our Digital Filters?



It's the TED Radio Hour from NPR. I'm Guy Raz. And on the show today, ideas about reconciliation, especially in such a divided moment in time. And like a lot of us, Eli Pariser has been wondering how to better understand the people he doesn't always come into contact with, especially online.

ELI PARISER: So I think the first thing that everyone thinks of - and me, too - is I'm going to go find, you know, people who think totally different from me...

RAZ: Yeah, right.

PARISER: ...And follow them.

RAZ: Yeah.

PARISER: And that's great if you can handle it. Personally - and I think most people who do that experiment, it's kind of a recipe for madness a little bit because...

RAZ: Wait, have you done that?

PARISER: Yeah, totally. No, I...

RAZ: What'd you do? What'd you do?

PARISER: Oh, I follow a whole bunch of conservatives on Twitter and on Facebook.

RAZ: And you can't handle it? Or, like, it brings you down or drives you crazy or what?

PARISER: It sends me into...

RAZ: Like, a deep, dark place?

PARISER: Not a deep, dark place but, like, a - I can't kind of listen sincerely because I feel like the arguments are being manipulated.

RAZ: Of course, long before the presidential election in the U.S., there was talk of how hard it can be to get outside your own bubble online - to hear other arguments, to understand them, even empathize with them. Eli Pariser knows more about this stuff than most people. He's the founder of the viral content website Upworthy. And he's written a book all about what he calls filter bubbles, how websites like Facebook and Google filter what you see according to all kinds of criteria you can't control. Here's Eli on the TED stage.


PARISER: Your filter bubble is kind of your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out. So Facebook isn't the only place that's doing this kind of invisible algorithmic editing of the web. Google's doing it, too.

If I search for something and you search for something even right now, at the very same time, we may get very different search results. Even if you're logged out, one engineer told me there are 57 signals that Google looks at - everything from what kind of computer you're on to what kind of browser you're using to where you're located - that it uses to personally tailor your query results.

Think about it for a second. There is no standard Google anymore. And this moves us very quickly toward a world in which the internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, it'll be very hard for people to watch or consume something that has not in some sense been tailored for them.

RAZ: All of this matters more now than ever before, when here in the U.S. and even around the world people just cannot seem to agree on the most basic stuff, let alone more complicated problems.

PARISER: So a fascinating thing is that there was a big debate in this election about crime in the United States. Was crime up, or was crime down? And according to crime statistics, crime is not up. We're not in a record era of crime. But, you know, if you google it, you see a mix of more conservative-leaning websites that assert strongly that crime is high and growing. And that's kind of scary, because Facebook and Google are orders of magnitude bigger than any newspaper or any media institution has ever been.

But I think we still haven't really grappled with, you know, we have a sense of what the ethics of being a newspaper are, but we don't really have a sense of what the ethics of being an algorithm writer are and what our responsibilities are when it comes to this stuff.


PARISER: So one of the problems with the filter bubble was discovered by some researchers at Netflix. What they discovered was that in our Netflix queues, there's kind of this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know, we all want to be someone who has watched "Rashomon." But right now, we want to watch "Ace Ventura" for the fourth time.


PARISER: So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables, and it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that because they're mainly looking at what you click on first, they can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.

RAZ: I mean, we're now in a place where all of a sudden we have access to infinite information, to facts, to truth all the time. So how do we - like, how do we change the way filtering and algorithms work to avoid the information junk food?

PARISER: Yeah. I mean, I do talk to people inside Google and inside Facebook who are very concerned and very interested in how do we actually solve these problems? And one of the things that Facebook has done at times is, when you have a story that has one point of view, pop up a related story underneath it with a different point of view. That's actually pretty powerful. The second piece that's really interesting is that some of the spaces where people have been able to get good cross-partisan dialogue going, where they've been able to actually hear each other, are spaces where there's some unifying identity that supersedes your political identity.

And the classic example of this is message boards for sports fans, and it turns out that there's really rich conversation happening on sports message boards because we're all, you know, Red Sox fans first, and then I might be a Republican and you might be a Democrat. And having that sense of shared identity allows people to be less defensive and more open to new ideas.


PARISER: So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable.

And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing because I think we really need the internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a web of one. Thank you.


RAZ: Eli Pariser - you can see his TED talk at ted.com.

Copyright © 2016 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.