Facebook Says It's Aiming To Make Lying 'More Difficult'

Facebook's head of security policy, Nathaniel Gleicher, said that the company is working harder than ever to counteract efforts to interfere in the 2020 presidential election.


STEVE INSKEEP, HOST:

Will voters who rely on Facebook be any better informed this fall than in 2016? We've been talking with a man who says they can be. Nathaniel Gleicher is Facebook's head of cybersecurity policy, and he has a goal.

NATHANIEL GLEICHER: I think I actually want to make the act of trying to tell a lie or misleading people more difficult.

INSKEEP: Facebook provides financial support to NPR. We cover it the same as any company, which includes criticism of Facebook. It faces demands for Congressional regulation. It's also under pressure for false statements and conspiracy theories on the platform. Russian disinformation and domestic deception spread on Facebook in the last election.

Nathaniel Gleicher spoke with us as the company announced a new effort. Users will now see voting information centers, which provide what the name says.

GLEICHER: The voting information centers are going to include information about how to register to vote, how to check your registration status, how to vote by mail, how you request an absentee or a mail-in ballot. There will be election alerts from state election authorities about changes to the voting process. And there'll be facts about voting to help prevent or address any confusion about the process itself.

INSKEEP: This all sounds really useful, but are you concerned at all that what you have here is a small boat of accurate information in an entire sea of disinformation?

GLEICHER: First, I think it's important to know that this is only one piece of our election protection strategy. We have core teams that are hunting for deception campaigns. When they find them, we announce them publicly and we remove them from the platform. We also have automated systems that help tackle fake accounts and other types of deceptive behaviors and partnerships with fact-checking organizations to ensure that there's accurate information and context on posts.

But a piece of the puzzle is giving people accurate information. So it will be one piece, but it will be prominently displayed, and we'll be layering it on top and into News Feed to make sure people can see it as they're engaging with the election.

INSKEEP: I want to grant (ph) what you've said. You have made announcements of pages you've taken down, groups that you've banned. And yet, ProPublica in the last few days has published an investigation having to do specifically with false claims about voting, about mail-in voting - 350,000 views for a video on Facebook saying that if you mail in your ballot, Barack Obama is going to burn it - obviously, a false claim. A false claim that you can only vote by mail in California - hundreds of thousands of people saw that false claim. Now, Facebook did delete them, ProPublica says, but only after ProPublica called them out. Why do you think that happened?

GLEICHER: So we work with third-party organizations to identify any type of voter suppression or voter interference. A claim that misrepresents how you vote, where you vote or when you vote violates our community standards, and we will take that down.

INSKEEP: But in these cases, the posts were up and were seen by hundreds of thousands of people and had to be reported in the media before you found out about them.

GLEICHER: I'd like to get to every single one of these posts as fast as possible. Large amounts of this we find proactively and remove ourselves. We also work with state elections officials because often, for this type of thing, they might see it first, particularly if it's localized. And so what I've found is that you don't want to rely on just one tool.

INSKEEP: You know, I suppose you had a kind of trial run of this recently when the president of the United States made a false claim that was carried on Facebook - a false claim about mail-in balloting leading to a corrupt election. Of course, millions of people have voted by mail. There's no evidence of that whatsoever. Facebook did add a label with more information about voting, which sounds approximately like what you're talking about, but did not call the claim false. How come?

GLEICHER: So we want to make sure that people hear what elected officials are saying and what they think about voting. Quite frankly, I think that information is an important factor in how some people will choose to vote in the fall. And so we want to make sure that information is out there and people can see it, warts and all.

INSKEEP: If the president just says something false and nowhere do you say it's false, where's the context for that?

GLEICHER: I completely agree. I think accurate context is very important. That's why having that link right there provides as much accurate context as we can provide so that people can see, here's what the experts are saying, here's how the process works. And they can weigh that against what elected officials are saying.

INSKEEP: I want to ask about a couple of political movements that are based on false conspiracy theories, one of them being QAnon, which, roughly speaking, is the idea of a deep state of child predators who are attacking the president - no evidence for that whatsoever. But there is a Republican QAnon believer who won a congressional primary in Georgia the other day, and it appears to be a movement that is growing a lot on Facebook. Why would that be?

GLEICHER: Enforcing against QAnon is something that we've been doing pretty consistently, actually. Just last week, we removed a large group with QAnon affiliations for violating our content policies. We've also removed networks of accounts for violating our policies against coordinated inauthentic behavior. We have teams that are assessing sort of our policies against QAnon as they currently exist and are exploring additional actions that we can take.

INSKEEP: And yet, The Guardian was able to find the other day 170 groups, pages and accounts on Facebook and Instagram with 4.5 million followers and even adds - this is a quote from The Guardian - "Facebook's recommendation algorithm has continued to promote QAnon groups to users, and some groups have experienced explosive growth." Do you deny that?

GLEICHER: This is exactly the type of adversarial behavior we'd expect to see. We consistently see actors on the platform as we put tools and controls in place looking for ways to get around them.

INSKEEP: I'm looking for a yes or no there. The question is, are your algorithms actually promoting QAnon sites in spite of your best efforts?

GLEICHER: I think I would want to talk about specific instances to weigh in on that. It really depends on the particular site that we're talking about and the post that they're making.

INSKEEP: So you're not sure if that's true or not, with 170 different groups cited by The Guardian.

GLEICHER: This is a place where I think we have work to do. I think that the boundaries around what constitutes a QAnon site or not are pretty blurry, and that becomes a challenge in all of this. It's why we're looking at this and we're exploring some additional steps that we can take.

INSKEEP: There's another conspiracy political movement, the boogaloo movement, which wants a new civil war. The Tech Transparency Project has been studying them, identified 110 new Facebook boogaloo groups created just since June 30, which is when you said you'd crack down. And it was a well-intentioned effort, I'm sure, to crack down, but 110 new groups since then. And the Project says that the boogaloo adherents just avoid using the word boogaloo, and so they seem to escape your search engines. Is it that easy?

GLEICHER: We don't rely on any one particular keyword for exactly the reason you said. We've seen these actors constantly change their terms. We knew and expected that sites like this would come back as we removed them. And what I can tell you is we've been taking aggressive action against not just sites discussing boogaloo or pages discussing boogaloo, but pages and groups where we see links between those discussions and intent or efforts to engage in physical activity or push further towards real-world harm. And what that does over time is it drives these actors to spend more and more of their effort evading the controls we're putting in place as opposed to coordinating for the ends that they're seeking.

INSKEEP: I'm thinking about the fact that a very large part of the 2020 election is going to play out on your platforms. That seems like a basic reality. And I'd like to know your level of confidence. Are you willing to say with some confidence that most voters who rely on Facebook will be mostly well-informed if they do so? Are you willing to say that?

GLEICHER: I think one of the things we've learned is that protecting an election is a whole-of-society challenge. And what I can tell you is that we have teams that have been focused on finding and exposing these types of campaigns for years. And I think that the tactics we saw in 2016 will be much less effective today.

INSKEEP: Do you think that most voters, if they rely on Facebook, will be mostly well-informed?

GLEICHER: I think people are going to read on our platform based on what they're looking for and the people that they're talking to. What I can tell you is every single person in the United States is going to have at the very top of their feed and on posts that they read about voting accurate information about how to vote, how the process works and what the experts in state government and elections officials are saying about how the processes function.

INSKEEP: I think I hear you saying that you believe everybody is going to have an option to find accurate information, but it's going to be up to them to find it.

GLEICHER: Well, we're going to put that accurate information in front of people as many ways as we can.

INSKEEP: Nathaniel Gleicher is Facebook's head of cybersecurity policy.

Copyright © 2020 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.