Yael Eisenstat, ex-Facebook employee, says company knew for years about problems

Yael Eisenstat, who was in a leadership position at Facebook in 2018, says people within the company have been warning for years about disinformation and how extreme voices are boosted on Facebook.

Ex-Facebook employee says company has known about disinformation problem for years



MARTIN: We're going to focus now on another allegation against Facebook - that the company didn't do enough to prevent extremists from organizing online and ultimately attacking the U.S. Capitol on January 6. According to the documents, Facebook did put measures in place to deal with potential violence around the 2020 election. But when a movement built on a lie got traction - it was called Stop the Steal - those measures fell short.

Yael Eisenstat was the global head of elections integrity operations at Facebook in 2018. I asked her how this misinformation was allowed to spread.

YAEL EISENSTAT: This is something that was years in the making - maybe not Stop the Steal in particular, but for years people, whether it was people like myself who worked in the company, researchers, journalists, outside academics, had been warning the company that the way their product works - I mean, there are multiple things here - the way they amplify certain kinds of content, the way they recommend people into certain groups, was creating a situation that was already amplifying some of the most extreme voices. But then you couple that with what I believe was one of the most dangerous political decisions the company made - and this was actually around the time that I was working there - to not fact-check political actors, to allow some of the biggest voices with the biggest platforms to violate their own policies. These combined to make this perfect storm.

So when they talk about all the measures that they put in place to protect the elections, that doesn't override all the things that they were not doing, such as really making sure that groups weren't engaging in this kind of activity, making sure that certain political elites were not using the platform to spread lies about the election. This was going on for years. It's not something they could have stopped suddenly in three days after an election.

MARTIN: So it seems like there's two issues, as you just laid them out. It's vetting the potential of any particular actor to spread lies and misinformation but also, once it's out there, taking it down, right?

EISENSTAT: So even more, I mean, content moderation - what to take down and what to leave up - is extremely difficult. And yes, that is something that is on the platforms to handle because that is definitely not government's area. But how your platform is designed, how you're monetizing it, what you're allowing and what you're not, basic guardrails - that's what's really missing here.

I mean, as you said, if you, for years, have allowed an environment where you're not holding politicians to the same standards that you're holding the rest of us to on your platform - because, really, you want to preserve your power - and you're not reining in these algorithms and recommendation engines that are pushing people to a lot of this content - which some of the documents show; I mean, this piece about Carol's journey to QAnon is the perfect example. And that's really about not wanting to harm your own growth. So you're not reining in how your recommendation engines are pushing people into some of these groups. Those two things combined at a time when we have a really volatile situation in the U.S. around our election - it's a perfect storm.

And let's be clear, Facebook had years to fix this. A lot of the documents are focusing on 2019 and 2020. But this is work that many people have been trying to get them to do, including my team when I was there, for years.

MARTIN: What were you met with when you raised these red flags, when you raised these concerns?

EISENSTAT: Sure. So in 2018, the very first thing I tried to do was ask why we were not putting any sort of fact-checking standards into political advertising. And why I did that was it was very clear to me that advertising - first of all, maybe not the most important thing on the platform, but it's paid speech. We're putting labels on the ads to make them look even more credible. And we're giving the political actors targeting tools to target people with this messaging.

And if we weren't even fact-checking that, and we were allowing politicians to use advertising to send different messages to different groups, that was dangerous. And there was no appetite. I was pushed out for these kinds of ideas and for trying to build voter suppression protections into political advertising. There just wasn't an appetite from leadership for that.

MARTIN: You think that's likely to change now with all this pressure?

EISENSTAT: I think it can only change through regulation, to be honest. They have made it clear they will not self-regulate.

MARTIN: Yael Eisenstat - she's Facebook's former global head of elections integrity operations for political advertising. We appreciate your perspective. Thank you.

EISENSTAT: Thank you.

Copyright © 2021 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.