Max Fisher on how social media manipulates our emotions : Consider This from NPR

Social media platforms have helped fuel political polarization and incitements to violence across the globe, from the Rohingya genocide in Myanmar to the January 6 insurrection at the U.S. Capitol.

This is because the platforms' algorithms consistently select content that evokes anger and outrage in users in order to maximize engagement. And sometimes, those extreme emotions turn into extreme actions.

New York Times reporter Max Fisher took a deep dive into the impact of social media in his book, "The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World." He shares with us how platform leaders have prioritized profit and growth over safeguards and how the polarizing effect of social media is only speeding up.

In participating regions, you'll also hear a local news segment to help you make sense of what's going on in your community.

Email us at considerthis@npr.org.

Does Social Media Leave You Feeling Angry? That Might Be Intentional


ARI SHAPIRO, HOST:

Back when the Syrian refugee crisis was at its peak, a couple of researchers wondered why some German towns saw a lot of violence against refugees and others did not. So Karsten Müller and Carlo Schwarz decided to look closely at every anti-refugee attack across Germany from January 2015 to February 2017 - thousands of cases in all. They analyzed the communities where the attacks took place using all kinds of variables, trying to find a pattern. They looked at wealth, demographics, political leaning. Nothing stood out. And then, they hit on something.

MAX FISHER: The places where Facebook usage - not general internet usage, but specifically Facebook usage - was significantly above the average for Germany, the number of attacks on refugees was also well above the average.

SHAPIRO: That's author Max Fisher, who writes about this research in his new book, "The Chaos Machine." It's not just that violence against refugees went up in places where people used Facebook a lot. The researchers also looked at outages - Facebook disruptions - and they found that when the platform went offline in a specific place, attacks against refugees in that community dropped.

FISHER: Extended time on social media is addictive, and it changes your behavior, and it changes the way that your mind works. And it does that in a consistent direction towards more outrage, more extreme ideas and a greater hatred of us versus them.

SHAPIRO: By now, we have evidence of this from all over the world. People have studied the role of social media in cases of ethnic violence and even genocide. In Myanmar, people used Facebook to organize massacres against Rohingya Muslims. And in the U.S., January 6 insurrectionists have said that YouTube played a role in shaping their extremist ideas.

(SOUNDBITE OF ARCHIVED RECORDING)

FRANCES HAUGEN: My fear is that without action, divisive and extremist behaviors we see today are only the beginning. What we saw in Myanmar and are now seeing in Ethiopia are only the opening chapters of a story so terrifying no one wants to read the end of it.

SHAPIRO: Frances Haugen was a data scientist at Facebook. And here we should mention that Facebook's parent company, Meta, pays NPR to license NPR content. Haugen testified before the Senate in October of last year, saying the company consistently valued profit and growth over protecting its users.

(SOUNDBITE OF ARCHIVED RECORDING)

HAUGEN: The result has been more division, more harm, more lies, more threats and more combat. In some cases, this dangerous online talk has led to actual violence that harms and even kills people.

SHAPIRO: A former security chief for Twitter made similar accusations in a hearing before Congress just today. Disinformation on social media has been a huge problem in the U.S. as well as overseas. During the pandemic, Cincinnati pediatrician Nicole Baldwin said she saw people dying unnecessarily because of conspiracy theories they picked up online.

NICOLE BALDWIN: I think the challenge is you can say almost anything that you want to say on social media. You can claim expertise in a field, whether or not you have it. And the algorithms on these channels push out content that people are looking at. So it's terrifying. It's frustrating.

SHAPIRO: On almost every giant social media platform, algorithms push users to stay longer, pulling them into echo chambers. And that drive to hold people's attention is something tech executives openly discuss. Here's how Jack Dorsey put it in a congressional hearing last year, when he was head of Twitter.

(SOUNDBITE OF ARCHIVED RECORDING)

JACK DORSEY: Ultimately, we're running a business. And a business wants to grow the number of customers it serves.

SHAPIRO: All the biggest social media companies have been accused of sacrificing people's safety for profit. Tech executives often respond to the criticism in similar ways. They say people are responsible for their own actions. When members of Congress asked if Facebook bore some responsibility for the January 6 insurrection, here's how CEO Mark Zuckerberg replied.

(SOUNDBITE OF ARCHIVED RECORDING)

MARK ZUCKERBERG: I think that the responsibility here lies with the people who took the actions to break the law and take - and do the insurrection. And secondarily, also the people who spread that content, with repeated rhetoric over time, saying that the election was rigged and encouraging people to organize, I think that those people bear the primary responsibility as well.

(SOUNDBITE OF MUSIC)

SHAPIRO: CONSIDER THIS - social media is not only shaping our view of the world, it is shaping world events, and we're only beginning to fully understand how.

(SOUNDBITE OF MUSIC)

SHAPIRO: From NPR, I'm Ari Shapiro. It's Tuesday, September 13.

It's CONSIDER THIS FROM NPR. If you feel like checking social media leaves you angrier and more outraged, that's not your imagination. And reporter Max Fisher says the polarizing effect of social media is only speeding up. Here's how he explains it in his book, "The Chaos Machine."

FISHER: (Reading) Remember that the number of seconds in your day never changes. The amount of social media content competing for those seconds, however, doubles every year or so, depending on how you measure it. Imagine, for instance, that your network produces 200 posts a day, of which you have time to read about a hundred. Because of the platform's tilt, you will see the most outraged half of your feed. Next year, when 200 doubles to 400, you will see the most outraged quarter; the year after that, the most outraged eighth. Over time, your impression of your own community becomes radically more moralizing, aggrandizing and outraged. And so do you. At the same time, less innately engaging forms of content - truth, appeals to the greater good, appeals to tolerance - become more and more outmatched, like stars over Times Square.
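
The arithmetic Fisher is reading there - a fixed attention budget, a pool of posts that doubles every year, and a feed that ranks the most outrage-provoking posts first - can be made concrete with a short simulation. This is a toy sketch, not anything from the book or from any real platform; the "outrage" scores, the 100-post attention budget and the doubling rate are all illustrative assumptions.

```python
import random

# Toy model of the passage above (all numbers are made up for illustration):
# your attention budget stays fixed while the number of posts competing for it
# doubles each year, and an engagement-ranked feed shows the highest-"outrage"
# posts first.

ATTENTION_BUDGET = 100   # posts you have time to read per day
posts_per_day = 200      # posts your network produces in year 0
random.seed(0)

for year in range(4):
    # Each post gets a random outrage score in [0, 1]; the underlying mix of
    # content your network produces never changes from year to year.
    network = [random.random() for _ in range(posts_per_day)]

    # The ranked feed surfaces the top-scoring posts, so you only ever see the
    # slice that fits your attention budget.
    feed = sorted(network, reverse=True)[:ATTENTION_BUDGET]

    fraction_seen = ATTENTION_BUDGET / posts_per_day
    avg_outrage = sum(feed) / len(feed)
    print(f"year {year}: you see the top {fraction_seen:.0%} of posts; "
          f"average outrage of what you see = {avg_outrage:.2f}")

    posts_per_day *= 2   # content competing for your attention doubles

# Typical output: the fraction you see shrinks (50%, 25%, 12%, 6%) while the
# average outrage of your feed climbs, even though the network itself never
# got angrier - which is the point of the passage.
```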

SHAPIRO: Whether it's Facebook, YouTube or Twitter, artificial intelligence is deciding what to show us. When I talked with Fisher, I asked him why, out of the whole spectrum of things the platforms' algorithms could show us, the things they choose are so often outrageous and polarizing.

FISHER: It's because those are the things that are most engaging to us because they speak to a sense of social compulsion, of a group identity that is under threat. Moral outrage, specifically, is probably the most powerful form of content because what you are experiencing is millions of years of evolution, the very specific environment that we, the human animal, evolved and had to learn to survive in, where you had to survive by finding a place and seeking the approval within this group, but also defending against outside groups. And no one in Silicon Valley deliberately decided to surface these, but they developed these systems that were just uncannily powerful at identifying what was going to be the thing that was going to hook us. And this just happened to be it.

SHAPIRO: So you give lots of examples of how this played out. One specific one is about a goal that YouTube set - to get a billion hours of watch time per day by 2016. And you write that to reach this goal, YouTube basically brain-hacked millions of Americans in the middle of the most contentious election in modern history. Spoiler - they hit their goal. How did they achieve it?

FISHER: So what the systems that govern YouTube - that govern what you see - realized is that what would actually serve that goal was providing you content that creates some sense that you and your identity are under threat. And so what that might mean is that if you're looking for, let's say, health tips - let's say information about vaccines - the best thing for YouTube to show you isn't straightforward health information. The best thing for YouTube to show you is something that gives you a sense that you are part of some community - let's say moms who are concerned about their kids - and that that community is under threat from some outside danger. That triggers a sense of alarm that will make you want to come back and spend more and more time watching.

SHAPIRO: So the key question isn't, what's the best information we can give the user; the key question is, what will keep the user's eyeballs glued to our platform?

FISHER: And what's amazing is that if you go back and look at internal conversations within YouTube - and this is something they talked about openly at the time - they explicitly said, our goal should not be to surface the best information; our goal should be to surface content that is emotionally engaging, that will get people to spend more time on the platform. And they were saying this, like you said, right at the start of what would turn out to be arguably the most consequential election in American history.

SHAPIRO: You did some reporting in Sri Lanka, where people used Facebook and WhatsApp, which is owned by the same company, to gin up ethnic violence. And high-ranking Sri Lankan officials begged Facebook to do something before violence broke out. And you write that every single report was ignored. And then, after mobs took to the streets, destroying homes and businesses, the government finally reluctantly blocked all access to social media. Will you read what happened next?

FISHER: Yeah, sure.

(Reading) Two things happened almost immediately. The violence stopped. Without Facebook or WhatsApp driving them, the mobs simply went home. And Facebook representatives, after months of ignoring government ministers, finally returned their calls; but not to ask about the violence. They wanted to know why traffic had zeroed out.

SHAPIRO: I mean, what do you make of that?

FISHER: So the thing about these companies is they do employ a lot of really smart, thoughtful people who are, within bounds, trying their best to limit the harms of these platforms. But these are...

SHAPIRO: (Laughter) You're trying to spin this in the best possible light. It looks really bad.

FISHER: It's - yeah. Well, the thing is, these people do not ultimately have the authority and the power within these companies. The people who have the authority and the power are, just like in any major corporation, the profit drivers. And those are the people who want to get that traffic up so they can sell ads against it and continue to make billions and billions of dollars.

And that is the thinking that prevails, of which Sri Lanka was a really striking case. It's not even that valuable of a market - they don't make that much money there. And the warnings that Facebook was getting ahead of this violence were so specific and came from so many very senior people in government, saying this is going to lead to an outbreak of vicious racial and religious mob violence - which is exactly what happened. And maybe somebody in the company did care, but not enough for any demonstrable change to happen at the company. But as soon as the traffic dropped out, the company mobilized into action. And I think this is an instructive story for what really drives and concerns these companies, behind all the rhetoric, behind all the high-flying manifestos about how they're bringing us to a new stage of human evolution.

SHAPIRO: In light of all this, how do you explain the millions of people who use social media every day and don't get radicalized or pulled into conspiracy theories or flame wars?

FISHER: I think that's a really important question, because for the overwhelming majority of us, the effect is subtle. Spending more time on social media will make you significantly more polarized. It will make you more prone to feeling outrage - moral outrage especially - in yourself. And that is something I think we all feel, and it might ring true to those of us who spend time on social media and don't become crazy conspiracy theorists but still feel that pull.

SHAPIRO: Let's talk about solutions. You believe that eliminating these companies altogether would create a lot of harm. So what do you think the solution is to these problems? Is there a way to change the model so companies are not so incentivized to feed people outrageous stuff that'll keep them glued to the platform for hours?

FISHER: Whenever I would ask the experts who study this, you know, what do they think that the solution should be, it's always some version of turning it off; not turning off the entire platform, not shuttering the website, but turning off the algorithm, turning off likes, the little counter at the bottom of the post that shows you how many people liked it or retweeted it. That's something that even Jack Dorsey, the former head of Twitter, floated as an idea because he came to see that as so harmful. But turning off these engagement-maximizing features is something that we have actually experimented with. And a version of social media like that, I think, could potentially bring a lot of the good that they bring, which is real, and mitigate some of the harms.

(SOUNDBITE OF MUSIC)

SHAPIRO: Max Fisher, author of "The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World." Allison Aubrey contributed reporting.

(SOUNDBITE OF MUSIC)

SHAPIRO: It's CONSIDER THIS FROM NPR. I'm Ari Shapiro.

Copyright © 2022 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.