Social Media Companies Brace For Post-Election Threats

With so many people voting by mail, experts warn final results of the 2020 election may be delayed — opening the door for misinformation on social media. What are the platforms doing to prepare?

RACHEL MARTIN, HOST:

Prepare for confusion come November. With so many people voting by mail this election, the results could be delayed. That uncertainty could create an opportunity for conspiracy theorists and misinformation to spread on social media. Facebook and Twitter say they see this problem coming. So what are they doing to prepare for the days after Election Day? NPR's Shannon Bond is taking a look. We should note, Facebook is among NPR's financial supporters.

SHANNON BOND, BYLINE: A lot of things keep Yoel Roth up at night. As head of site integrity at Twitter, a big part of his job is dreaming up all the ways bad actors could misuse the platform.

YOEL ROTH: Having a vivid imagination is key. None of the threats are off limits.

BOND: When it comes to the election, there are lots of threats to plan for.

ROTH: A high-profile figure's account gets taken over, to the possibility of a large-scale spam or bot attack, to the risks of foreign interference like we saw in 2016.

BOND: Twitter, Facebook and other tech companies do regular threat modeling exercises both on their own and together. They've come up with different attacks and gamed out how they would respond.

ROTH: And we really undertook a process to try and predict what the worst-case scenarios were based on what we had seen previously in 2016, 2018 and in elections around the world, as well as some of the things that we thought were likely to happen in the United States this time around.

BOND: Facebook's chief operating officer, Sheryl Sandberg, told All Things Considered this week the company is on high alert for problems after November 3.

(SOUNDBITE OF ARCHIVED NPR BROADCAST)

SHERYL SANDBERG: We are worried about misinformation. We are worried about people claiming election results before the election. And we've already said, if any candidate tries to claim victory too early, if any candidate does inaccurate information, inaccurate ads after that election, we are going to take them down.

BOND: Facebook, which is among NPR's financial supporters, and Twitter are already dealing with a lot of election misinformation, like a president set on undermining the legitimacy of the voting process, as he did during this week's debate.

(SOUNDBITE OF ARCHIVED RECORDING)

PRESIDENT DONALD TRUMP: As far as the ballots are concerned, it's a disaster.

BOND: If results are delayed as mail-in ballots are counted, that means more opportunity to cast doubt on the outcome and amplify that doubt using social media. Of course, this isn't the first time we haven't known election results right away. In 2000, it took 36 days for a winner to emerge in the battle between George W. Bush and Al Gore.

CLINT WATTS: Yeah. There are some angry lawyers in Bush v. Gore. But it was pretty tame (laughter) compared to today.

BOND: Disinformation expert Clint Watts, who has long studied how Russia interferes in U.S. politics, says the threat in 2020 has really changed.

WATTS: And people couldn't be mobilized on social media in this way.

BOND: Facebook and Twitter have made rules against undermining election results or disrupting the peaceful transfer of power. Both say they are working to stop harmful content from spreading widely. But here's the thing: these platforms are designed so that posts that really capture attention, because they're emotional or sensational, go viral. Turning that off would be a big change to the way Twitter and Facebook work. With so much at stake, experts say it's not enough that the platforms announced these rules. Chloe Colliver studies online extremism at the Institute for Strategic Dialogue in London.

CHLOE COLLIVER: The major gap now is not in the design of the policies on the platforms. The major gap is in enforcement of those policies.

BOND: And in those days after the election, if people don't know who won the presidency, disinformation may be spread by Russia, even by the president of the United States. The question is, can or will the social media companies hold the line?

Shannon Bond, NPR News.

Copyright © 2020 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.