Yaël Eisenstat: Why we need more friction on social media

Facebook profits from being frictionless, says Yaël Eisenstat. But without friction, misinformation can spread like wildfire. The solution, Yaël says, is to build more friction into social media.

MANOUSH ZOMORODI, HOST:

It's the TED Radio Hour from NPR. I'm Manoush Zomorodi. On the show today, friction. And in the tech world, entrepreneurs want as little of it as possible. Their goal is to build frictionless platforms and gadgets that are intuitive to use and keep us coming back.

YAEL EISENSTAT: What has happened is now we expect everything to be first, fast, free and frictionless. But are we sure that's the best idea when it also comes to political rhetoric, to how we debate incredibly important topics that matter for our entire planet?

ZOMORODI: This is Yael Eisenstat. She is known as an advocate for building slower technology and as a Facebook whistleblower, but we'll get to that. Her story really starts on the Kenyan-Somali border right after 9/11, where she was working in counter-extremism for the State Department.

EISENSTAT: Everything that I learned from my time overseas was that things take time. Engaging with people who aren't like-minded takes time, you know. But a big part of my role was sitting down with communities who really hadn't had exposure to Americans, or to many Americans, and, as cliche as it sounds, sort of building bridges. As opposed to, here, I'm American, I'm here to help, let me tell you what you need, it was more, let me listen and figure out if there are ways for us to cooperate or, at the very, very least, find that we actually have some form of common humanity. It's a very soft skill, right? It's much more the diplomacy side than the harder skills of other counterterrorism tools.

ZOMORODI: The point of this diplomacy was to slow the spread of extremism, add friction by taking lots of time to just talk.

EISENSTAT: I mean, I would go back to the same villages over and over again, and I would sit and drink tea for hours. I would attend community meetings. I would sit with women's groups and just chat about what it means to be a woman in America and what it means to be a woman in their community. I mean, a lot of it sounds kind of weird and non-quantifiable and all of that, but it is, it really is spending a lot of time engaging mostly with people who have very different experiences than you. That is already friction.

ZOMORODI: So when did you realize that this diplomacy, this kind of friction, needed to be deployed back home in the U.S.? Like, when did you start to see extremism spreading here?

EISENSTAT: Yeah. So, you know, I left government in late 2013, and my goal at that point - you know, I always focused overseas. I think I had taken for granted that we were - that our democracy was secure. We were OK at home. My role was to focus on conflict abroad, on threats coming in from overseas. I didn't really have my focus on the U.S. And in 2015, just the way the rhetoric was going, the way people, largely online but offline as well, were starting to engage with each other just completely started paralleling things that I had seen in my counter-extremism days and made me do a complete 180 and focus all my efforts on, oh, my gosh, what is happening in the U.S.?

To be clear, I don't mean that I feel that everyone should get along and have the same political views and everyone should be polite to each other. It's not that. But there's a difference between disagreeing over issues and just fundamentally hating the person who has a different opinion than you. And so that's when I started digging in and really trying to figure out what that was, because I truly, truly believed that we were becoming not only our own worst enemy, but that we Americans were starting to be radicalized through some of the exact same steps I had seen radicalize people in different communities around the world, including the ones I had worked with along the Somalia border. And so I didn't have the answers. I didn't know what it was about yet, but I knew something really terrifying was starting to happen in the U.S. at that point.

ZOMORODI: How much did you blame social media or, I don't know, Fox News or - where were you looking in terms of the source of the problem?

EISENSTAT: I mean, sort of all of the above, but - so, yes, it started with really looking at the news but then honing in more and more on social media and, to be clear, not because I think social media is at fault for all of our societal ills or for some of the very real rifts in American society. But it did start to become more and more clear that the way certain social media companies were designed, they were taking advantage of those rifts and starting to monetize that anger, that divisiveness. And that's why I started getting really focused on social media.

ZOMORODI: One of the problems you helped identify and draw attention to was that these platforms have a lack of friction. Like, there is no spending hours drinking tea and debating ideas online. These companies help us share information as easily and as fast as possible, to the point where things, many say, have spiraled out of control.

EISENSTAT: Yeah, absolutely. This is a world optimized for frictionless virality. And if you want to be able to compete with the world that our online ecosystem has created, you have to be frictionless. And we all know now that companies like Facebook, for example - or I guess we'll call them Meta now - their entire business model is that they want you on their platform as long as possible. And so constantly feeding you content as quickly as possible is part of how they do that. And the idea of actually building a system to help you slow down, building a system with friction that allows you to stop and question, am I sure this is even true - like, that's what friction is. Friction is these signals that help you slow down so your brain can actually process what it's receiving. And that's not how these platforms are designed. I mean, it's kind of antithetical to how they make their money.

ZOMORODI: OK. So here's the twist, though, Yael - the twist to your story - which is that here you are, a Facebook critic, and then you went to work at Facebook.

EISENSTAT: Yep.

ZOMORODI: All right. Before you tell us what happened, just want to mention Facebook parent company Meta pays NPR to license NPR content. Get that disclosure out of the way. Tell us what happened.

EISENSTAT: So I started speaking to more and more audiences, especially of technologists. And the more I started speaking about this, the more my name started to get out there. And I think at some point, I started honing in on some of the ways - you know, I learned from others, also - how is Facebook designed? How is it incentivized? And the more I learned, the more I started speaking about them. And then they called.

ZOMORODI: What'd they say?

EISENSTAT: I mean, so, this is where I like to say they're very good at telling you what you need to hear because, oh, yeah, no, we need that. That's exactly what we need, you know. And they started making me feel like they meant it. And then on the same day that Mark Zuckerberg testified in the Senate - that famous hearing in 2018 about Cambridge Analytica - I listened to the entire thing and heard Mark Zuckerberg say over and over again how much he was going to prioritize elections integrity. And then a minute after that hearing ends, they call me with an actual offer. And the offer is to be their elections integrity head in what was called their Business Integrity Division, which is the part of Facebook that works to protect advertising, to protect the things they monetize, from bad actors or from, you know, whatever it is.

So I went in, I would say, cautiously optimistic that maybe this 2018 moment, this Cambridge Analytica scandal, the 2016 elections, all of that, maybe the company truly did want to finally figure out who do they want to be in this space. This is not a company that's just connecting friends or just, you know, serving you cute cat videos. This is a company having a profound impact on so-called public squares, on how political engagement happens, on elections themselves. And so I thought maybe this really is a pivot point for them. So, yeah, how could I say no?

ZOMORODI: Tell me about day one or the first few days. What do you remember of that time?

EISENSTAT: I mean, the first few days were insane, to be frank. So day one, it really did feel a little bit like a cult indoctrination. It was a lot of, like, you're the smartest in the world. The only way you got hired by Facebook is because you're the best. You're the brightest. Day two, I had my first meeting with my boss, and my boss told me in that very first meeting that they were changing my title and they would figure out my job description. Now, I had been hired to be the global head of elections integrity ops, but, yeah - we're just going to call you manager until we figure out what to really do with you.

ZOMORODI: So what did you do in those first few days? Like, did you start making trouble?

EISENSTAT: Probably. So, you know, I started reaching out to as many people as I could. I wanted to understand why we would fact-check certain information on the news feed but were refusing to do the same thing in advertising, and I started posing questions about it. And so my team put together this amazing plan on how to, at the very least, ensure that political advertising was checked to make sure it wasn't engaging in lies about how to vote, where to vote, when to vote - like, your most basic online voter suppression tactics. And when I sent up the chain that we had a whole plan - it was coordinated across multiple parts of the company, it wouldn't be about censoring speech, it would be very specific about voting information, and this was, by the way, to protect the U.S. midterm election that was coming up - I was pulled into a very senior person's office. I was yelled at, told that I made them look bad. I was accused of all sorts of things that really were shocking to me.

ZOMORODI: But what was it that they disagreed with? I mean, the tools that you were proposing, like, what was it about them that they didn't want to do?

EISENSTAT: So there are two things there. The first is what they said: as soon as we sent up this plan - and again, it's a plan to basically make sure that political advertising ahead of the 2018 midterm elections was not engaging in voter suppression tactics - the first thing was, well, what is the prevalence of that right now, which is their way of saying, is it a problem? My response was no, because political advertising hadn't started beefing up yet for the 2018 election. What I was doing was helping anticipate a problem that was coming so that we could stop it in advance. Well, that's just not really the Facebook way, because - back to the word friction - what I was proposing was going to put friction in the system. But after everything - after Cambridge Analytica, after the Russian interference in our election in 2016 - how do you not recognize how important it is to make sure you don't let your platform be used in a way that negatively affects our election? That, to me, was shocking. So that's one side of it.

The other side of it is a political decision. Fundamentally, that moment was a political decision on behalf of Mark Zuckerberg and others at the top to ensure that they were not angering the party in power at the time because, to be frank, if we were to start cracking down on ads that were engaging in lies about the election, that would disproportionately affect one party over the other because one party was engaging more in lies about the elections than the other. The funny thing is, I came in having worked in a nonpartisan world my entire life. I worked for three different administrations, Republican and Democrat. What I was trying to do is protect the integrity of our elections by building plans, some of which might slow things down, some of which might mean that you have to reject some ads. But my plan would also possibly anger certain political actors that Mark Zuckerberg did not want to anger.

ZOMORODI: And eventually, not too long after this, you were fired.

EISENSTAT: Yeah.

ZOMORODI: So instead of adding friction to the platform, you became the friction...

EISENSTAT: Oh, my gosh (laughter).

ZOMORODI: ...Essentially, at Facebook.

EISENSTAT: I was definitely the friction. Everything I was proposing definitely would have added some friction in and...

ZOMORODI: Yeah. And my understanding is, though, that in 2020 there were actually attempts to add some friction to these social media platforms ahead of the election. So what did that look like? What did you see?

EISENSTAT: Yeah, so, you know, for Twitter, for example, if you wanted to retweet an article, before you could retweet it, you would get this pop-up, I think it was, asking you, have you read this article? That is friction.

ZOMORODI: I got that pop-up. Totally.

EISENSTAT: Yeah. Building in friction says before you share this, have you read it? Or another way to add in friction is if you're going to share this article, you have to actually add your own thought. You have to add words into your tweet. And that might seem like a bit of an annoying thing, but think about what it's doing. It's making you slow down. It's making you actually think for a second, is this a good idea? Do I even know what this is really saying? And so that's what friction is. The funny thing is, to the best of my understanding, Twitter actually did see that some of the political rhetoric was tamped down a bit and misinformation was actually slowed ahead of the 2020 election.

But my understanding is, Twitter admitted that this worked but then reversed it after the election. So, listen, the companies know how to do these things. The real question is, why don't they? And that's because it's not going to help them make more money. The power that comes with that, the money that comes with that, is more important than figuring out how to create healthier discourse, or even the bare minimum of figuring out how to ensure that you're not allowing people to fundamentally destroy trust in our elections. I mean, the big lie, which spread like wildfire on social media - Facebook had many ways to build in enough friction to slow that down and then to possibly decide, you know what, we're not going to let the big lie spread. They had all of those tools. They just chose not to use them.
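For readers who want to see the mechanics, here is a minimal, hypothetical sketch in TypeScript of the two frictions described above - a read-before-share prompt and a required added comment. Every name and rule in it is an illustrative assumption, not Twitter's actual implementation.

```typescript
// Hypothetical sketch of the reshare-friction pattern discussed in the
// interview. All names and rules are illustrative; this is not any
// platform's real code.

type ShareRequest = {
  articleUrl: string;
  userOpenedArticle: boolean; // did the user click through before sharing?
  addedComment: string;       // text the user typed alongside the share
};

type FrictionResult =
  | { allowed: true }
  | { allowed: false; prompt: string };

// Gate a reshare behind two frictions: (1) ask whether the user has read
// the article, and (2) require the user to add their own words.
function applyShareFriction(req: ShareRequest): FrictionResult {
  if (!req.userOpenedArticle) {
    return {
      allowed: false,
      prompt: "Have you read this article? Open it before sharing.",
    };
  }
  if (req.addedComment.trim().length === 0) {
    return {
      allowed: false,
      prompt: "Add a thought of your own before sharing.",
    };
  }
  return { allowed: true };
}

// Example: a share attempt with no click-through gets the
// "have you read it?" prompt instead of going out instantly.
console.log(
  applyShareFriction({
    articleUrl: "https://example.com/story",
    userOpenedArticle: false,
    addedComment: "",
  })
);
```

The design point is the one Eisenstat makes: the gate does not block the share outright; it just inserts a pause that forces a moment of processing before the content moves on.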

ZOMORODI: You know, there will be people listening who are like, this ship has sailed. You raising the flag is not going to do anything. And I wonder, A, if you think that's true, and B, what you think that means for upcoming elections here in the U.S.

EISENSTAT: That's a great and difficult question. I would say this - a lot of this is very much in the public consciousness now. I mean, people say, oh, we all know that Facebook does X, Y or Z. That comes from years of people raising these alarms. So I personally disagree that the ship has sailed. But that said, I am now working with groups that truly want to change the incentive structures around how our public discourse happens. And I'm super encouraged that there are companies really thinking about these things differently. The question is whether they'll be funded, whether they'll be successful. But every little piece of the puzzle, whether it be legislation that's written or companies that want to operate differently - every single one of those nuggets matters. No, it's not happening fast enough. Yes, I am still terrified for upcoming elections because so many fundamental issues have not been fixed. But if I can't hold on to a little grain of hope in this sort of dinosaur-slow societal shift, then I just lose all hope. So I do think that these things still matter. I do think we can continue to pressure the companies, pressure our governments and pressure the people who finance the next wave of companies to think differently about all of these incentive structures.

(SOUNDBITE OF MUSIC)

ZOMORODI: That's Yael Eisenstat. Since we recorded our conversation, she's been appointed vice president and head of the Center for Technology and Society at the Anti-Defamation League. You can see her full talk at ted.com. Also, we reached out to Meta for comment on Yael's allegations but, as of this recording, have gotten no response. On the show today, Ideas About Friction. I'm Manoush Zomorodi. And you're listening to the TED Radio Hour from NPR. Be right back.

Copyright © 2022 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.