Former Google Ethicist Tristan Harris on Tech's 'Human Downgrading' : It's Been a Minute NPR's Elise Hu steps in for Sam and sits down with Tristan Harris, a former design ethicist for Google, while listeners share their tech burnout stories and solutions. We also hear from WIRED senior writer Nitasha Tiku on what regulation is happening in the tech industry right now.

How Tech Hijacks Our Brains, Corrupts Culture, And What To Do Now



From NPR, IT'S BEEN A MINUTE. I'm Elise Hu, in for Sam Sanders. Today on the show, we're talking attention. We asked listeners to write in about how their tech addictions are affecting their lives.

JORDAN PERRY: I was just spending a lot of mindless time scrolling through and scrolling through and scrolling through and...

PAUL COPELAND: Reddit, Twitter, Facebook...

PERRY: ...Scrolling through.

COPELAND: ...And then also watch TV at the same time.

PERRY: But I couldn't just delete my Facebook and Twitter and Instagram accounts.

ANDREA GONOR: I would go to bed quite early. But I would still be in bed with my phone...

COPELAND: And the only time that I didn't have a phone...

GONOR: ...For, like, hours.

COPELAND: ...Was when I was asleep.

HU: That's Jordan Perry (ph), Paul Copeland (ph) and Andrea Gonor (ph). Each of them basically staged a tech intervention on themselves to break their habits. Jordan's solution was to take control of her Facebook feed.

PERRY: And I unfollowed every single one of my friends. This took about half an hour, 45 minutes when I finally did sit down to do it. It has been really life-changing for me to have so much more control about what I see and when I see it.

HU: Andrea bought a lockbox - not a digital one, an actual lockbox - to combat her addiction.

GONOR: So I've put a reminder on my phone to (laughter) lock it up at 11:30. So I put the phone there at 11:30, and then I lock it up.

HU: Paul decided to do something seemingly obvious but difficult in practice.

COPELAND: I go for walks. I go outside. I take photos. I find analog hobbies that allow me to express myself without tapping in front of a computer screen.

HU: These are one-off solutions that worked for Jordan, Andrea and Paul. But they and the rest of us who are always connected are part of a larger system designed to hook us.


TRISTAN HARRIS: You know, turning our phones into slot machines, turning Tinder and dating into slot machines, our email into a slot machine - we check our email 74 times a day. And it's all in the name of getting our attention.

HU: That is Tristan Harris. He's trying to push back against an entire industry that's out for our attention. The business model is simple - the more attention you give platforms, the more ads they can sell and serve you, the more money they make. Harris knows this economy well. He's a tech design ethicist who used to work at Google. Tristan says that while a common worry with technology was that it would outsmart us - you know, replace our jobs, overcome human strengths - he says something more dangerous is already happening.


HARRIS: I want to claim to you today that this point being crossed is at the root of bots, addiction, information overload, polarization, radicalization, outragification (ph), vanityification (ph), the entire thing, which is leading to human downgrading.


HU: Human downgrading - two words Tristan has coined for this problem. He argues that talking about it like this can help us find a solution. So have these tech companies really done this - turned us into a polarized, vain, detached people - or did we do it to ourselves? How do we take back our attention? Do we even have the power to do that? I sat down with Tristan Harris to talk through some of these questions. Later, after the break, we'll also hear from a journalist who covers Silicon Valley on whether efforts like Tristan's could work.


HU: We're back. You're listening to IT'S BEEN A MINUTE from NPR. I'm Elise Hu, in for Sam Sanders.


HARRIS: Thank you all so much for coming. Can you hear me? Yeah.

HU: Last month, Tristan Harris got many of the biggest investors and creators from Silicon Valley, even celebrities, together at the San Francisco Jazz Center to consider the moment we're at.


HARRIS: So if you ask people, what's wrong? What are we trying to fix here? What we wanted to do was say, OK, how can we get behind a common understanding of what's actually the problem, and how can we fix it?

HU: When Tristan was at Google, he was tasked with making sure the products the company made were not harmful. But he found that a company's good intentions aren't good enough. When he realized he couldn't make change from the inside, Tristan left Google in 2013. Now he's trying to raise attention around the effects of our dependence on social platforms.


HARRIS: It works better when you sit there wanting to up your follower count, when you sit there wanting to get more likes, when you sit there wanting to get more attention for what you look like. And we spend now about a fourth of our lives on the screen in these artificial social systems.

HU: Tristan now has his own nonprofit, the Center for Humane Technology. And he's calling for a larger conversation about reforming these artificial social systems. But it means getting the companies to first take seriously the social costs of what they've created.

HARRIS: Hey, Elise.

HU: Hey, good to see you again.

HARRIS: Good to see you again. Hey.

HU: Thanks for doing this.

HARRIS: Yeah, that's why I'm here.

HU: I sat down with Tristan in San Francisco. We started by talking about what this technology is doing to our brains.

HARRIS: Technology is overpowering our ancient paleolithic brains. So if you think about what was different about Adobe Photoshop or Microsoft Word or the original Macintosh in 1984...

HU: What?

HARRIS: It didn't play on human weaknesses. It didn't say, let me show you photos of your friends liking that Photoshop thing that you just made, or when you hit the paint tool versus the drawing tool, it notified 20 friends that you did that. We didn't play with people's social psychological emotions. And what's wrong with technology is that it's specifically designed to - because of the race to hijack human attention, the race to the bottom of the brain stem, it's designed to get basically lower and lower into what will get your attention, which means I play into your reptilian brain, your lizard brain to get the dopamine out.

That's not enough to get your attention. I have to get you addicted to getting attention, so I start adding that number of followers that you have. And your friends have more followers than you. And don't you want to be like them? And don't you want to get more attention from your friends? And don't you want to start that YouTube channel so you can get more attention? And YouTube controls how many views you get on that video when you post it and gives you a burst at the start. Then they'll hold some back, so you come back and check 20 more times.

That entire ecosystem is leading to shortening of attention spans, more social isolation, more vanity and micro-celebrityification (ph) of the world, more outrage because outrage works better in the attention economy than non-outrage. More outrage leads to more polarization because saying outrageous things leads people to fall on both sides of walls and yell at each other about it, misinterpreting each other's thoughts. That obviously just downgrades our civility, our common ground.

And so this is actually a connected system that we call human downgrading. And human downgrading is like the climate change of culture because it's like turning up the dials on these little tiny features of our lives and watching the 2-billion-person ant colony go from being normal, just doing what ants do, to suddenly going crazy. And that, we were trying to show, is what was happening in technology to people.

HU: Why is the economy - why does the economy, or why do tech platforms that are building this - why are they favoring attention? Like, why is that the metric?

HARRIS: Well, the interesting thing was - Herbert Simon wrote in the 1970s that when information becomes abundant - so we went from this world. We didn't have much information available to people. You had to, like - it was a competition for who could get the information that someone else didn't have. But suddenly information exploded, and everyone has access to all this information.

So the - when information becomes abundant, attention becomes finite because attention is this tiny little resource, this little mouth that has to gobble up information. But it's the choke point. And there's only so much human attention out there. It takes nine months to grow new human attention. And so what happened in the attention economy (laughter) is that, you know, it was so big.

And then suddenly that's not enough attention, so we had to double the size of the attention economy. How do we do that? Oh, I know. Let's get people to multitask. So they're paying attention to two screens at once or two windows at once. They're watching a YouTube video, and they're doing their email.

HU: Or 14 tabs.

HARRIS: Or let's do 14 tabs. Let's quadruple, quintuple, you know, the size of the attention economy. So the problem is we're fracking people's brains to split their attention into less and less valuable chunks to get more attention out of them. Not because anyone wants this to happen or anyone is evil or malevolent, but because that's what the game theory - they're competing against each other to get this limited supply of attention.

And this attention economy is the substrate for all other economies. It's the substrate for culture, for family life, for politics, for elections, for children, for mental health. It's the thing that's underneath our ability to do everything else because it's the choke point of how we make sense of the world and how we make choices. And if we debase or break that, then we can't work on climate change. We can't work on agreeing on shared agendas of solving the problem. We can't do any of those things. So this is a mission-critical existential problem that we have to solve.

HU: What can we personally do? Or is there anything we can personally do, given this...

HARRIS: Well, there's lots of things people can do personally. The important thing to recognize is that even if you perfectly set your screen colors to gray, and you turn off all those notifications, and it doesn't buzz anymore, and you lock the phone in your bedroom in a cabinet overnight, you still live in a country whose elections will be decided by these forces.

You still go to, you know, send your kids to school with other kids who will continue to use social media and will make choices about wanting to go places so they can take their Instagram photo to look good to other people. And you are going to get sucked into that if you're a developing, you know, teenager. So that's important because you have to recognize that, like climate change, it's a system. And that system has impacts for everyone no matter what we do.

For us personally, you can make sure that you only get notifications when a human being wants your attention. Most notifications that come in are because a machine is calculating, what's the perfect thing I could show you now that might get you to come back, and instead say, hey, I only want messages when a human being sends me an actual text that's just directed towards me. And everything else, there's no reason why that should come in.


HU: So we can take steps as individuals to be more conscious of where our attention goes. But he says it's a bigger problem that needs systemic solutions. What shape would they take? Can the giant tech companies that make billions of dollars in this attention economy actually change?

NITASHA TIKU: I think the first-order problem is centralized power. You know, people refer to tech companies as ungovernable. And I don't think that is because, you know, they're not - you can't come up with a policy that will apply to them. They just have so much power and control.

HU: Coming up, we'll speak with Nitasha Tiku, a senior writer at WIRED, to talk through the ideas to rein in what's happening. That's after the break.


HU: We're back. You're listening to IT'S BEEN A MINUTE from NPR. I'm Elise Hu, in for Sam Sanders. Tristan Harris says our socially linked technology platforms are downgrading us. To better understand what's happening in the broader tech industry, I called up WIRED senior writer Nitasha Tiku. She's been following Tristan's work for a while. And she's noticed a trend within the tech companies he's calling out.

TIKU: I think the first-order problem is centralized power. You know, people refer to tech companies as ungovernable. And I don't think that is because, you know, they're not - you can't come up with a policy that will apply to them. They just have so much power and control. And so I think something like antitrust or, you know, kind of unwinding mergers and acquisitions - if you think about it in a very - you know, in a more concrete way, Facebook.

Facebook's power comes not just from Facebook, obviously, but the fact that it controls WhatsApp and Instagram, and it has made a number of acquisitions that allow it to see anytime another potential competitor is becoming more powerful. And then they have the money and the power to - you know, to acquire that company. So I think making companies follow our existing antitrust laws, existing laws around competition is one way to start.

HU: What kind of moment would you say we are in right now with the industry? Are more voices from inside, like Tristan Harris', speaking up and calling for the industry to police itself?

TIKU: Right. I think that there's two categories of people speaking out. There are the people like Tristan who, you know, have left the industry, and they're able to call attention to certain things. And they're focused more on the wider impact of how the decisions made inside tech companies are affecting consumers.


TIKU: Then we've also seen a string of worker activism that is talking about how tech companies are, you know, treating their own workers and using their power internally. And now I think we're seeing that people just aren't sure how - you know, like, what's going to be a successful lever of power on Facebook. Like, who will they listen to?

HU: Yeah.

TIKU: You know, they have a monopoly in a way. They don't need to listen to their consumers. Regulators are - you know, it's been two years now - right? - at least of the tech backlash, and we haven't seen anything that's been a real check on their power. So I think workers and whistleblowers, you know, they've been told, you have sway over your employers in a way that nobody else does. So how are you going to use it? That's what we're seeing right now.

HU: Let's talk about those checks on power, the regulation aspect that you already referred to. Entrepreneur and Facebook co-founder Chris Hughes made waves just a couple of weeks ago saying he wants to break up Facebook. And Tristan Harris told me this.

HARRIS: In the U.S., we used to have something called the Office of Technology Assessment that used to do studies - neutral studies, nonpartisan studies - on new technologies and try to govern them as things were coming out. It was actually one of the things that was scrapped in the Newt Gingrich era by congressmen who were trying to make government thinner.

This, especially in the age of exponential tech, with CRISPR coming, with AI coming, with brain-computer interfaces coming or just with the social media products that are already, you know, having dramatic effects on the fabric of society, many of which are very negative that we know about, why not have that back? Right? We have the FAA - the Federal Aviation Administration - to manage the common air traffic control. Why can't we have attention and media traffic control in a global information and attention environment?

HU: Why can't we?

TIKU: Well, I think that comparing tech companies to industries past is one of the most successful ways to kind of break through this image that they have made of themselves as these, like, progressive change agents, you know, not even really companies, more like movements, you know, more like forces for democracy. But I think one thing that is sometimes missing from the call for more regulation is the fact that tech companies have a huge influence on even these debates about regulation.

You know, it's also the way that they wield their, you know, kind of soft power. A lot of times, the nonprofits that you have come and testify before Senate, they're funded by these tech companies. You know, and I don't think that they're - it's like a quid-pro-quo relationship. But if they're the ones funding these debates, it just changes the nature of the debate - right? - and what we're talking about.

HU: So if it isn't fines from government agencies like the FTC, then how feasible is change from, say, Congress? Is anyone proposing serious legislation? Because it seemed from a lot of the hearings that we've seen that many members of Congress, many lawmakers don't even seem to understand the technology itself, let alone how to force structural change. Does all of this seem futile?

TIKU: You know, it does often seem futile. But I would say that the argument that they don't understand the tech companies, I think it's really in their interest to make it sound like, you know, ad targeting, microtargeting, the way advertising-based business models work is so kind of complex and un-understandable that, you know, any regulator, they're just going to put foolish roadblocks in place.

You know, these regulators are also looking at biotech. You know, they're also looking at nuclear power plants. I don't think that this stuff is un-understandable. I think that the people who've been explaining it to them, you know, haven't been candid in their responses. It's been extremely difficult to get information out of them. You hear the same thing from German regulators and nonprofits in, you know, Nigeria and also, you know, congressional aides here.

HU: So who is the onus then to force change really on? Is it on the tech companies themselves? Is it on policymakers? Is it on all of us as civil society? If we were to see change, where would we see the pressure come from?

TIKU: I think that obviously regulators. I think civil society. I think it's important to look at these as holistic problems. The one thing I would say is that looking at consumers is incredibly unfair in the current situation because, you know, people say, well, look, nobody is leaving Facebook, or nobody is leaving Google, or - you know, that is the way to measure whether consumers are fed up. And they don't really have a choice.

So I think, you know, potentially asking consumers different questions. Look at actually all of the potential options at their disposal. Like, would you want tech companies to retain less information about you, you know, collect less information overall? These things do not have to be as complicated as they are. It's just a total lack of transparency into how your information is being used and how these decisions are being made.

HU: Meanwhile, you mentioned before we started taping that a lot of these problems that you have been seeing as a tech reporter the wider public is aware of. So what's next? You know, what do you think happens next, given the fact that we are more and more aware of what's happening to us?

TIKU: I think that the conversation and the language is getting more sophisticated. You know, one thing Tristan is really right about is having the ability to describe what's happening to us is invaluable. You know, I often try to think about it by comparing behavioral targeting in advertising with regular advertising. You know, we understand, like, subliminal advertising in TV, or we understand, you know, when they put gum at - you know, at the grocery counter what they're after, even, like, how menus make you, you know, go for the second-cheapest wine always.

And we just don't have any concept, any shared language about how to understand the ways that we're being manipulated by tech companies. So I think that that is a great place to start. You know, we're just so far behind that awareness is useful, you know, with consumers, with regulators and even with the media.

HU: Nitasha Tiku from WIRED, thank you so much.

TIKU: Thanks for having me.


HU: Whether you want to call it human downgrading or manipulation by centralized tech powers, we are all reckoning with it, at least for as long as it has our attention. Thanks again to Nitasha Tiku, senior writer at WIRED. Also thanks to Tristan Harris, co-founder of the Center for Humane Technology. Tristan will have a new podcast out all about human downgrading and how to push for change. It is called "Your Undivided Attention," and it comes out June 10.

This episode was produced by Anjuli Sastry and edited by Jordana Hochman. I'm Elise Hu, filling in for Sam Sanders. I'll be back in your feeds on Friday with the Weekly Wrap. Talk to you then.

Copyright © 2019 NPR. All rights reserved. Visit our website terms of use and permissions pages for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.