SCOTT SIMON, HOST:
We begin the hour with fallout from the Facebook files. Internal documents show that Facebook employees raised the alarm about Stop the Steal, the rallying cry of that false claim that, somehow, the presidential election had been stolen from Donald Trump. That cry turned into a rally which turned into a deadly attack on Congress and the vice president and on democracy. NPR's tech correspondent Shannon Bond joins us. And we will note Facebook is among NPR's recent financial supporters. Shannon, thanks so much for being with us.
SHANNON BOND, BYLINE: Thanks for having me, Scott.
SIMON: You have been looking at internal documents from Facebook. What do they show?
BOND: Well, we know Facebook spent years preparing for the 2020 election. It didn't want a repeat of what happened in 2016 - of course, when Russians used the platform to interfere. So the company had this emergency playbook. It called these break-the-glass measures. These were temporary steps to keep the platform safe. So, for example, it down-ranked posts that might break its rules against violence or hate speech. That means it made them less visible to other people, to give the company more time to check them out before they spread widely. But like you said, the election largely went off without these big issues. And so afterwards, Facebook started to turn off a lot of these emergency measures, and the ones it did keep in place didn't turn out to work so well.
SIMON: And what did that mean for the Stop the Steal movement?
BOND: Well, one thing that Facebook did was to try to limit just how fast these groups - it called civic groups that are related to politics - how fast they could add new members. So it put daily limits on invitations. But the Stop the Steal groups got around that. Facebook took down the first Stop the Steal group after just two days, but more kept popping up in its place, and that kept happening. It was this game of whack-a-mole. And these organizers were also able to elude detection by choosing their words really carefully and by posting these disappearing stories. And as a result, these groups - they were filled with calls for violence, lies about election fraud. And they were growing faster than Facebook could keep up.
And this left a lot of employees inside Facebook really upset. These were researchers and folks who work on keeping the platform safe. And they had been warning, these documents show, that civic groups especially were a problem that Facebook needed to crack down on. But they said leadership just didn't take these warnings seriously enough. So, you know, on January 6, one wrote on Facebook's internal message board, quote, "Rank-and-file workers have done their part to identify changes to improve our platform but have been actively held back." Some people I spoke to who have since left the company said, you know, Facebook really should have kept these emergency measures in place much longer after the election or even made permanent changes to how it works.
SIMON: And what does Facebook say now?
BOND: Well, the company says, look. The blame for January 6 lies with the people who stormed the Capitol and with the people who encouraged the violence. It says it took a lot of steps to protect its platform, and it's proud of the work it did. And when it comes to, you know, what it could have done, what it should have done - Facebook says, look. These emergency guardrails, these break-the-glass measures are blunt instruments with big trade-offs. Of course, if you slow down the platform for these bad groups, you'll also have to slow it down for everybody else. You know, that has an impact on growth. And Facebook is trying to strike this balance, right? It says it doesn't want its platform to be used to incite violence. It also doesn't want to be seen as censoring people, which, of course, is something Republicans and Donald Trump accuse it of doing.
SIMON: Shannon, you've based your reporting on internal research and messages and documents. Please tell us how you got those.
BOND: That's right. So recently, you know, we've heard about a former employee named Frances Haugen, who's come forward as a whistleblower. She worked on a team protecting the platform and left Facebook earlier this year with this trove of thousands of pages of internal documents. She says she came forward because she thinks Facebook is not making the right trade-offs when it comes to safety versus growth. So she turned those documents over to federal regulators, to Congress. They were also shared with a consortium of news organizations, including NPR. And I should say that Facebook broadly disputes her claims.
SIMON: NPR's Shannon Bond, thanks so much.
BOND: Thank you.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.