Facebook Disputes Claims It Fuels Political Polarization And Extremism

Facebook is giving users more control over what they see, as executives, including Nick Clegg, global affairs vice president, defend it from charges that algorithms favor inflammatory content.

NOEL KING, HOST:

One line of criticism against Facebook is this. The platform captures and holds our attention while its algorithms dictate what we see. And so the company just has too much influence on the national conversation. Those critics say Facebook has to take some responsibility for the polarized times that we're living in. Nick Clegg is Facebook's vice president of global affairs and communications. Yesterday on Medium, he posted a 5,000-word essay addressing critics. We called him on Zoom to talk about it. And full disclosure, Facebook is one of NPR's financial supporters.

NICK CLEGG: This essay tries to deal with some of those wider concerns about social media, concerns about social media and the impact on polarization in society and so on, but also and more specifically, focuses on this fundamental issue. Who's in the driving seat? Is it us, you know, human beings? Or is it the machines? Is it the algorithms? And what I seek to explain is that, certainly, as far as Facebook is concerned, people are often, if you like, much more in control or much more in charge than they sometimes feel like they are.

And what I announced in this article is some additional controls which give people real transparency into how the systems work and allow people to pull levers. So for instance, in a new feature, which I announced today, you'll be able to, in effect, override the algorithm and curate your own news feed composed of posts and groups and people who are your favorites. And so it's all about, in a sense, enhancing the control that users have as they use these sophisticated, new social media communication tools.

KING: I understand how that might help in some senses. But what if I'm the user who wants a bunch of posts about how to overthrow the U.S. government?

CLEGG: Well, if you are invoking violence, then of course that will be removed. And you won't be able to post that at all. And we remove a significant amount of content all the time where it breaks the company's own rules. For instance, just on COVID, we've been removing misinformation which could lead to imminent physical harm since March of last year. So over the last year, for instance, we've removed more than 12 million pieces of content on Facebook and Instagram where we feel that the information or the post would, you know, promote fake preventative measures or exaggerated cures, which would harm people. So if that's what you want, well, you can't do that on Facebook and Instagram. If you want to discuss politics, of course, you know, we live in a free society, thankfully. And you're free to do so, you know, on social media, just as much as you're free to do so sitting around your kitchen table.

KING: Toward the end of your essay, you warn against blaming Facebook's algorithm for divisiveness and for hatred. You write, quote, "we need to look at ourselves in the mirror and not wrap ourselves in the false comfort that we've simply been manipulated by machines all along." In your view, how much of the solution and how much of the blame lies with individual users?

CLEGG: This really isn't about apportioning blame. The point I was trying to make was that there are very, very popular ways of communicating, you know, messaging apps - iMessage, Telegram, Signal, WhatsApp and so on - where, you know, millions, billions of people use that, you know, every second of the day to communicate with others. And yet there's no algorithm involved in those. And yet they're also, of course, a route by which people say unpleasant and hateful things as well as beautiful and uplifting things. And so the point I was just trying to make was that it's just foolish to say it's all the user's fault, but equally to say it's all somehow a faceless machine's fault. It's the interaction between the two.

And the research, which I cite in the piece, you know, suggests that the reasons, for instance, for polarization in the U.S., you know, precede - polarization was developing decades before social media was even invented. And I guess what I'm trying to do, which is difficult because sometimes, quite understandably, people want, you know, simple answers to what are complex issues - I'm sort of urging us nonetheless to try and grapple with the complexity of this and not try and all reduce it to some faceless machine that we blame for things that sometimes lie deep within society itself.

KING: Nick Clegg is vice president for global affairs and communications at Facebook. Mr. Clegg, thank you for your time.

CLEGG: Thank you.

Copyright © 2021 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.