SYLVIE DOUGLIS, BYLINE: NPR.
(SOUNDBITE OF DROP ELECTRIC SONG, "WAKING UP TO THE FIRE")
STACEY VANEK SMITH, HOST:
This is THE INDICATOR FROM PLANET MONEY. I'm Stacey Vanek Smith. Social media and the role it plays in our economy, our society and even democracy itself has been the subject of a lot of debate since the mob attack on the U.S. Capitol on January 6. And as we prepare for the inauguration of a new president this week, questions about the regulation of social media companies and how to balance that with free speech and the free market are looming large. Tristan Harris is the president of the Center for Humane Technology. He's also part of a new book of essays called "The New Possible: Visions Of Our World Beyond Crisis." After the break, I talk with Tristan about the role social media plays in our divided country, how we got here and what's at stake right now.
(SOUNDBITE OF MUSIC)
VANEK SMITH: Tristan Harris, thank you for joining us. This week, as we prepare to inaugurate Joe Biden as president of the United States, there are still huge security concerns and just social concerns based on the mob attack on the U.S. Capitol a couple of weeks ago. You study social media and social media companies for a living. What role did you see social media playing in that attack?
TRISTAN HARRIS: You can make quick statements like, you know, social media was used to organize the events of January 6, but that's not what I saw. What I saw was the 10-year culminating process that led to people operating with such certainty about either the election being stolen or the world is run by a bunch of pedophile globalists. We're 10 years into this mind-warping process.
And I think that's what people aren't paying enough attention to. It's not just that there is one actor inciting violence in the form of Trump and de-platforming him. It's that this entire business model profited from giving each of us a more certain view of reality - and when you give different people certain views of reality that are incompatible with each other, it just drives up more division and violence.
VANEK SMITH: How does the business model incite actions like this? Because on its face, social media isn't necessarily incentivizing anything except for more people to join.
HARRIS: Well, it incentivizes the clicks, right? And so every time you, you know, swing your finger up and you swipe and it has to show you something, it has to pick what to show you. You might say, well, I picked my friends. I picked the content I clicked on. I picked what I shared, so therefore, I'm responsible for what I'm seeing. But that's not how it works at all. Behind the glass slab, Facebook has one of the most powerful supercomputers in the world pointed at your brain, trying to calculate: which thing could I show you that would be most likely to keep you swiping as opposed to stopping? And if I show you something that challenges your view of reality, you're not going to stay.
So we've been going through this process for 10 years, where each time you swipe your finger, people get something that confirms affirmation, not information. And then you start associating that with these radioactive topics of - whether it's race or the election or inequality. And that has shattered our shared assumptions. It's not just that we have slightly different sets of facts. We have very, very different assumptions of what's been happening and who did what and whether the balance sheets are equal over the last 10 years.
VANEK SMITH: It was an interesting moment that I think Twitter CEO Jack Dorsey was on an island or something when all this happened. He sort of made the call to ban President Trump from Twitter from this island. And it was kind of an interesting moment of realizing how much power this one individual had. And I believe he himself said this is a bad precedent. I mean, do these companies want regulation? Do you think they're comfortable with the amount of power they have?
HARRIS: You know, both Jack and Mark Zuckerberg have actually said for many years that they shouldn't have this much power and they shouldn't be the ones to decide. You can make of that what you will.
VANEK SMITH: (Laughter).
HARRIS: But I think if we want this moment of de-platforming the president to not be viewed as some kind of power grab by one political side - I don't think that's what's going on here, but I don't think there's a way to prove that to anyone unless there's actually an honest democratic process. You know, imagine if, you know, Nancy Pelosi could just unilaterally wake up on the island of Tahiti one day and say, you know what, I'm going to impeach Trump from Tahiti, right? That's essentially what Jack did, right?
VANEK SMITH: Yeah.
HARRIS: You can't have one person just arbitrarily make decisions on behalf of billions of people. And I think Jack is actually authentically saying that he doesn't think that that's how it should work. And that's why I think we need to draw up a whole, you know, constitutional convention of all sorts of issues. We need a new fairness doctrine for the digital age, for a networked age. There's a whole long agenda of things that we need to cover about how to make the digital sphere operate democratically and for the people's interest, and not just autocratically with tech companies who don't want to basically step into the shoes of responsibility that they now occupy.
VANEK SMITH: Should we treat tech companies, maybe not like traditional companies, but like utilities?
HARRIS: So I think there's two issues. One is governing them, which is kind of a question of utilities. They're so powerful. They're so essential. They're the central information infrastructure that governs our democracy. Obviously, if you're taking over that level of public squares and public interest, they should be governed according to, you know, what the public interests are. So making them public utilities is, you know, definitely in the direction of where we would want to go.
But we also have to ask, beyond the question of governance, is the system itself compatible with a democracy that works? Meaning, we have a system that produces kind of a cultural cancer of shortening attention spans, more addiction, more polarization, more outrage. We can talk about governing that as a public utility, or we can talk about changing it from being a cancer in the first place to something that's not a cancer. What would it look like to have it deepen our concentration, longer attention spans, deeper thought processes, not just a hyperfocus on the present, less polarization, more constructive speech? You know, that's what we think of as kind of humane technology.
VANEK SMITH: I mean, it does seem a little tricky to create regulation around this, honestly, because these are companies. This is dealing with people's ability to communicate with each other in the way that they want. I mean, why not just let these companies make money as they will and let these conversations happen as they will?
HARRIS: Because we won't survive as a civilization.
VANEK SMITH: Whoa. OK.
HARRIS: If we just let the self-driving car of this business model continue, we already saw what happened January 6, culminating 10 years of that process. This is clearly a moment where we cannot survive as a civilization, as a democracy, if we allow that to continue. I'm not even sure, frankly, that we can as it is because we've already infected ourselves with the kind of divisive malware where our minds are not able to agree with each other.
Even if you subtract social media right now, we are clearly seeing such different understandings of reality and we're not really interested in abandoning our perspectives. Watching lots of the media as I have the last week, it's not really evident that anyone wants to, you know, open-mindedly say, oh, where might I be completely wrong about my understanding of where the other side is coming from? And that's what really worries me.
VANEK SMITH: Is the difference just the scale? Because what you're talking about with, like, people kind of not hearing each other and not being necessarily open to hearing the opinions of the other side, I mean, that does seem very human to me, you know? Like, much older than Twitter or Facebook.
HARRIS: We've always had partisan media. We've never had supercomputers pointed at our brains perfectly able to stimulate our emotions and know more about us than we know about ourselves. It knows that, you know, resonant frequency - the minor seventh - of which anger to put into your nervous system next because it's literally playing chess against 3 trillion calculations of which one will do the best job of that. And so it's checkmate against the human nervous system because it simply has seen more patterns of which emotions, which kinds of text, will tend to produce the results that it wants for its business model. Again, this is possible to escape, but only if we collectively recognize that this is a completely unsustainable, unsurvivable situation if we don't fundamentally change this business model.
VANEK SMITH: Well, Tristan, thank you so much for talking with me today. I really appreciate it.
HARRIS: Thanks so much for having me.
VANEK SMITH: Tristan Harris is the president of the Center for Humane Technology. He's also part of a new book of essays called "The New Possible: Visions Of Our World Beyond Crisis."
(SOUNDBITE OF MUSIC)
VANEK SMITH: This episode of THE INDICATOR was produced by Dave Blanchard and fact-checked by Sam Sy (ph). THE INDICATOR is edited by Paddy Hirsch and is a production of NPR.
(SOUNDBITE OF MUSIC)
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.