Hear What A Facebook Insider Told Congress About How Its Apps Hurt Kids : The NPR Politics Podcast Former Facebook product manager Frances Haugen told senators that the company knows its products harm children and stoke division, but that executives have continued to prioritize growth over safety.

This episode: White House correspondent Scott Detrow, congressional reporter Claudia Grisales, and tech correspondent Shannon Bond.

Connect:
Subscribe to the NPR Politics Podcast here.
Email the show at nprpolitics@npr.org
Join the NPR Politics Podcast Facebook Group.
Listen to our playlist The NPR Politics Daily Workout.
Subscribe to the NPR Politics Newsletter.
Find and support your local public radio station.

KIERSTEN: Hi. This is Kiersten (ph) from Grand Canyon, Ariz. Yes, that Grand Canyon. I'm on my lunch break from being a park ranger. This podcast was recorded at...

SCOTT DETROW, HOST:

(Laughter) I assume that has better lunch break views than, like, any other place in the world. It is 2:07 Eastern on Tuesday, October 5.

KIERSTEN: Things may have changed by the time you hear it, but I will still be asking guests to not feed the squirrels.

DETROW: (Laughter).

KIERSTEN: All right. Here's the show.

(SOUNDBITE OF THE BIGTOP ORCHESTRA'S "TEETER BOARD: FOLIES BERGERE (MARCH AND TWO-STEP)")

DETROW: Hey, there. It's the NPR POLITICS PODCAST. I'm Scott Detrow. I cover the White House.

CLAUDIA GRISALES, BYLINE: I'm Claudia Grisales. I cover Congress.

DETROW: Claudia, I recently learned the White House is actually a national park as well. The ground it is on is a national park. There are squirrels there, too. No one is telling me not to feed the squirrels though.

GRISALES: I know. I think we have more squirrel freedom here with what we want to do.

DETROW: Yeah (laughter).

GRISALES: Take that.

DETROW: So today in Congress, a whistleblower who used to work at Facebook is in front of a Senate committee testifying about the way she says the company has misled the public.

(SOUNDBITE OF ARCHIVED RECORDING)

FRANCES HAUGEN: During my time at Facebook, I came to realize the devastating truth. Almost no one outside of Facebook knows what happens inside of Facebook. The company intentionally hides vital information from the public, from the U.S. government and from governments around the world. The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems and its role in spreading divisive and extreme messages. I came forward because I believe that every human being deserves the dignity of the truth.

DETROW: That's Frances Haugen, a former product manager for Facebook. She has provided documents backing these claims to federal authorities and to news outlets. NPR's Shannon Bond is also here again. Hey, Shannon.

SHANNON BOND, BYLINE: Hey, Scott.

DETROW: And, Shannon, you cover Facebook. We do need to say right off the top that Facebook is among NPR's financial sponsors. But we are going to talk about this testimony today and how this possibly changes the picture when it comes to how Congress regulates Facebook and social media. Shannon, can you start by reminding us how Haugen found her way in front of this committee, because she has made a remarkable series of choices that have been coming public in very different ways over the last few weeks?

BOND: So Haugen used to work at Facebook. She spent about two years there working on a team called Civic Integrity, combating political misinformation. And what's important to know about her is that her background is actually in how algorithms are designed, right? So, like, how social networks like Facebook decide what to show users. You know, she basically grew pretty disillusioned with the company. She felt like it was declining to make its platform safer if those changes risked its growth and profits. And it was, you know, hiding what it knew about the problems on its platform from the public and from regulators.

So before she left Facebook earlier this year, she copied tens of thousands of pages of internal documents - things like internal research and communications. And a lot of these documents appear to show that the company was very well aware of the ills of its platforms - things like the...

DETROW: Yeah.

BOND: ...Toxic risks of Instagram to some teenage girls, to mental health, the prevalence of drug cartels and human traffickers, and, you know, issues with how its algorithms seem to be actually encouraging a lot of negative, angry, divisive posting. And she says all of that is evidence that the company puts profits ahead of people. And so these documents, they formed the basis of this big investigative series by The Wall Street Journal. But she also shared them with the SEC and with members of the Senate committee. And so today the Senate committee wanted to hear from her.

DETROW: One of those moments was Minnesota Senator Amy Klobuchar talking to Haugen about a specific example - and I think it's some of the reporting that's gotten the most attention - of how Facebook employees were bluntly talking about the way that some of the content on Instagram, in particular, can lead to eating disorders among teenage girls.

(SOUNDBITE OF ARCHIVED RECORDING)

HAUGEN: Facebook knows that their - the engagement-based ranking, the way that they pick the content in Instagram for young users, for all users, amplifies preferences. And they have done something called a proactive incident response where they take things that they've heard, for example, like, can you be led by the algorithms to anorexia content? And they have literally recreated that experiment themselves and confirmed, yes, this happens to people. So Facebook knows that they are leading young users to anorexia content.

AMY KLOBUCHAR: Do you think they are deliberately designing their product to be addictive beyond even that content?

HAUGEN: Facebook has a long history of having a successful and very effective growth division where they take little, tiny tweaks, and they constantly, constantly, constantly are trying to optimize it to grow. Those kinds of stickiness could be construed as things that facilitate addiction.

DETROW: Shannon, I have several questions for you based on that. First of all, just to help listeners out, can you give us a good definition of the difference between engagement-based ranking and other ways that social media platforms can show you what you're looking at? Because that's an important part of this conversation.

BOND: Yeah, it absolutely is. I mean, I think it's something that could sound a little technical, but actually, anybody who's spent any time on social media, you know, is going to be familiar with this. So what Facebook is doing is choosing what to show you, right? They're not going to show you every single post that comes out. What they're trying to look at is, what have you interacted with in the past? You know, what are you most likely to be interested in? What are you most likely to comment on or re-share yourself? And they're using all of those measures of engagement to decide what to show you - what they think is going to be most relevant, ultimately, to keep you on Facebook, to keep you engaged. Same thing on Instagram.

And so what Haugen is saying is that kind of engagement-based algorithm, those - that kind of ranking, which is what a lot of her expertise is in, you know, is actually having really, really harmful effects because, you know, in the example she was giving about eating disorders, I mean, you know, what may happen is, in her words, a teenager is on Instagram and is - searches for something about dieting content. Well, that's a signal to Instagram about what she's interested in. And then, you know, that signal can be reinforced. Suddenly, the algorithm's like, OK, great, we're going to show you more dieting content, more fitness content. And that can quickly escalate into, you know, eating disorder content and other issues.
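The feedback loop Bond describes - past engagement with a topic making the algorithm surface more of that topic - can be sketched in a few lines of Python. This is a hypothetical illustration of engagement-based ranking in general, not Facebook's or Instagram's actual system; all names here (`rank_posts`, `topic_affinity`) are invented for the example.

```python
# Illustrative sketch of engagement-based ranking (hypothetical, not
# Facebook's real code): score each candidate post by how often the
# user has engaged with its topic before, so past engagement is reinforced.

from collections import Counter

def rank_posts(posts, interaction_history):
    """Order candidate posts by the user's predicted engagement.

    posts: list of dicts, each with a 'topic' key.
    interaction_history: topics of posts the user previously engaged with.
    """
    # Count past engagements per topic; unseen topics score 0.
    topic_affinity = Counter(interaction_history)

    # Highest-affinity topics rise to the top; ties keep their feed order.
    return sorted(posts, key=lambda p: topic_affinity[p["topic"]], reverse=True)

feed = [
    {"id": 1, "topic": "travel"},
    {"id": 2, "topic": "dieting"},
    {"id": 3, "topic": "dieting"},
]
# A teenager who searched dieting content twice and travel once:
history = ["dieting", "dieting", "travel"]

ranked = rank_posts(feed, history)
```

Every engagement with the surfaced posts feeds back into the history, which is the escalating loop described above: a couple of dieting searches and the feed tilts further toward dieting content.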

DETROW: Claudia, was anyone on the panel defending Facebook or even just saying, hey, maybe this is more nuanced? Or was this a situation where, shockingly on a lot of fronts, Republicans and Democrats came into this with the same worldview?

GRISALES: Yeah, that was another detail that stood out today - kind of the bipartisanship, if you will, in terms of the concerns that they had, which in some ways are being validated by these documents: that they were worried that these social media products were having a destructive impact on individuals, including children. And they're raising that alarm along with this witness and saying that something needs to be done.

DETROW: Shannon, how has the way that Congress has approached issues like this changed over the years? Do - are lawmakers more serious about putting actual regulations in place? And more broadly, do lawmakers seem to have a better understanding of how these massive companies actually work? Because we have seen many examples over the years of lawmakers not quite getting it and almost talking a foreign language when they try to ask serious questions to executives from Facebook and Twitter and places like that.

BOND: I think they're just taking this a lot more seriously. There's been a huge amount of engagement. I mean, today's hearing was really quite different because, you know, it was a real novelty not to just see these senators yelling at a representative of the company - right? - like in that sort of performative way. A lot of the questions really were incisive and I think reflected, you know, a much better grasp of how this company works, how these algorithms work. You know, what are these design decisions that Facebook is making? I think they're really, you know, getting in much harder at that.

DETROW: All right, a lot more to talk about on this. But first, we're going to take a quick break.

(SOUNDBITE OF MUSIC)

DETROW: We are back. As all of this has come out, as all of these damning reports have come out in this testimony, Facebook has, of course, been responding. A Facebook representative was on NPR's Morning Edition this morning.

Shannon, what's the general overview of what Facebook is saying about Frances Haugen's testimony and the documents she's provided?

BOND: Yeah. I mean, the company is, as you would expect, pushing back pretty aggressively on this. They've actually just - in my inbox just now, I have a statement from their policy communications director. The issue that Facebook is hitting on right now, in the immediate aftermath of the hearing, is basically saying, you know, look, Haugen didn't work on a lot of these issues. She didn't work on Instagram. She didn't work on child safety. They're saying she didn't have any direct reports, that she wasn't in a decision-making role at the company. And I think they're implying here that she's out of her depth. And they've also said they don't agree with her characterization of many of these issues, of the research. They say it's been mischaracterized and that they do care about user safety, and that it's not accurate to say that they put profits over safety.

I mean, I think what is - what we've seen, though - I mean, Haugen admitted this much in the hearing. There were many times when she was asked about things by these senators, and she said, well, I didn't work directly on that or, you know, that's outside of my - you know, what I did. But what she was able to do and what's so powerful about her testimony - right? - was point to the internal documents, point to the research and say, this is what people at Facebook were saying. This is what the results of their own surveys were saying. And, you know, I think that's really compelling. And I think it's harder in a lot of ways for Facebook to push back against that.

DETROW: And that's what I want to ask you about. Because when you think about whistleblowers in the past who have really changed the direction of things, I think there's two camps, right? There's the shocking-new-revelations-that-nobody-knew-anything-about camp, and there is the camp where they're talking about something that a lot of people generally had a sense of, right? Like, is tobacco bad for you? Of course, it is. But here are lots of incriminating or really damning details of how that conversation is happening inside the company.

For you, you have covered Facebook for a long time. Which camp is this falling into? Like, are you learning things you have never heard before about Facebook or is this kind of, oh, that's how they're having these conversations?

BOND: I mean, I think it is a bit of both. On the latter point, you know, I think what makes Haugen a really remarkable witness here and makes these documents so compelling is that they are coming from inside Facebook. These are criticisms we've seen lawmakers make of the company. We've seen outside researchers ringing alarm bells - right? - about how engagement-based ranking is driving polarization, you know, real divisiveness and anger. Or, you know, what are the mental health impacts of a very photo-focused platform like Instagram on, you know, young users?

But what's different here is that we are seeing how the company itself is talking about it. What I've learned and what surprised me in a way, you know, is, like, seeing in these documents just how concerned Facebook is about its future, right? I mean, this is something the New York Times' Kevin Roose wrote about today, you know, this issue of many of their products are not growing as fast as they used to, especially in the U.S.

You know, this whole focus on Instagram and teenagers and this idea that the company has now put on hold of creating a version of Instagram for younger users, you know, a lot of that motivation seems to be driven by the fact that Instagram is losing market share among kids to rivals like TikTok. And they're worried. Like, what is the future of the platform if they don't get this next generation of users?

GRISALES: You know, if I could add, this testimony today really resonated with me in two different ways - as a journalist wondering about the issues they're dealing with and really bringing that out in detail and also as a parent to two teen girls. Instagram was a huge battle in our house. I had to bribe our girls multiple times to keep them off Instagram for a number of years because I was so worried about the impact.

DETROW: Yeah.

GRISALES: And just hearing all this, you just realize, wow - these are some of the issues I suspected might be happening, that it could negatively impact teens this way. And it just - it's coming out in living color today.

(SOUNDBITE OF ARCHIVED RECORDING)

HAUGEN: Eating disorders are serious, right? There are going to be women walking around this planet in 60 years with brittle bones because of choices that Facebook made around emphasizing profit today. Or there are going to be women in 20 years who want to have babies who can't because they're infertile as a result of eating disorders today. They're serious. And I think there's an opportunity here for having public oversight and public involvement, especially in matters that impact children.

DETROW: Claudia, let's end with this, though. You have noted that there is a surprising consensus here - I guess it's both surprising and not surprising. Because on one hand, this is a very clear-cut issue, and these details are really alarming on so many fronts, so of course there would be a consensus. But on the other hand, the situation in Congress right now is so toxic in a whole different way, with the two parties not being able to get anything done together.

GRISALES: (Laughter) Yes.

DETROW: Given both of those very different factors, do you see any path toward some sort of legislative fix here, some sort of effort to really regulate Facebook in a different way?

GRISALES: I think that will be very hard to come by. Even though we saw this spirit of bipartisanship - these worries from both sides of the aisle today - both parties are also very dug in on some very big issues right now. And it's not clear that they're going to get past that and get to something like this legislation. Like, we're hearing Blumenthal and others saying, you know, the days of no oversight are over, are numbered for Facebook. But getting from those statements to actual legislation - that's going to be really tough to do in Congress today, with how partisan it is when it comes to passing any bills on this.

DETROW: All right, we're going to wrap this conversation up for today. Shannon Bond, thank you so much for coming back on the podcast.

BOND: Always happy to be here.

DETROW: All right. And I'm Scott Detrow. I cover the White House.

GRISALES: I'm Claudia Grisales. I cover Congress.

DETROW: Thank you for listening to the NPR POLITICS PODCAST.

(SOUNDBITE OF THE BIGTOP ORCHESTRA'S "TEETER BOARD: FOLIES BERGERE (MARCH AND TWO-STEP)")

Copyright © 2021 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.