The Spread Of Anti-Vaccine Misinformation : Short Wave In the second of two episodes exploring anti-vaccine misinformation online, Renee DiResta of the Stanford Internet Observatory explains why the Internet is so good at spreading bad information, and what big tech platforms are starting to do about it. Listen to the prior episode to hear more from Renee, and the story of pediatrician Nicole Baldwin, whose pro-vaccine TikTok video made her the target of harassment and intimidation from anti-vaccine activists online.

You can see Dr. Baldwin's original TikTok here.

Renee DiResta has written about how some anti-vaccine proponents harass, intimidate, and spread misinformation online here and here.

Email the show at shortwave@npr.org.

Vaccines, Misinformation, And The Internet (Part 2)


MADDIE SOFIA, HOST:

You're listening to SHORT WAVE...

(SOUNDBITE OF MUSIC)

SOFIA: ...From NPR.

If you missed yesterday's episode, you missed the story of Cincinnati pediatrician Nicole Baldwin and TikTok.

NICOLE BALDWIN: So I like the music. I think it's super fun to watch people being goofy and dancing and all that kind of stuff.

SOFIA: TikTok is a social media app where people basically do that - be goofy to music.

So who follows you on TikTok, do you know?

BALDWIN: My daughter (laughter).

SOFIA: Important; go on.

BALDWIN: Yeah, well...

SOFIA: I'm sure that's not horrifying for her in any way.

BALDWIN: Yeah. I - it's funny dinner conversation around our house, but I think a lot of other physicians right now are following me. I definitely do have - some adolescent patients, in the past couple weeks, have come into the office. And they're like, I saw your TikTok. I follow you on TikTok. So that's fun because that's who I'm trying to reach - is that population with some of these messages.

SOFIA: By messages, Nicole means posts about family health. And last month, one of them - a post of hers promoting the importance of vaccination - went viral and not entirely in a good way.

BALDWIN: I was scared. And we did, even at home, call the police just to have them do extra patrols around our house.

SOFIA: In the last episode, we examined how anti-vaccine activists harassed Nicole through social media, eventually finding her office and threatening her practice. This episode, we're going to explore why the Internet is so good at fueling misinformation.

RENEE DIRESTA: You know, the Internet is really good at helping people find other people like them.

SOFIA: Renee DiResta, who you also heard in the last episode, is the research manager at the Stanford Internet Observatory. She studies the spread of misinformation through what she calls inadvertent algorithmic amplification.

DIRESTA: And what that means is the recommendation engines or the trending function or the search function - we know that these are kind of features that send a lot of eyes in the way of certain information. And so the question has been, do the platforms have an obligation to ensure that the information that they're sending people to is quality information? And there are some people who think that the answer is no, that...

SOFIA: (Laughter) Right.

DIRESTA: But my belief is that we should not be in a world where the most popular website takes top billing on Google, particularly when the notion of most popular is derived from easily gamed metrics.

SOFIA: So today Renee DiResta helps explain how people game the Internet and how the Internet, in a way, games us when it comes to spreading bad information online.

I'm Maddie Sofia, and this is SHORT WAVE, a daily science podcast from NPR.

(SOUNDBITE OF MUSIC)

SOFIA: OK, so let's start back in 2013. Before she studied misinformation as it relates to vaccines, Renee DiResta was a new mom, just new-moming (ph) on Pinterest.

DIRESTA: I would save, you know, baby recipes and things. And then, of course, once you'd indicated an interest in infants or infant health - or, you know, we did some things that fell on the more crunchy end of the spectrum.

SOFIA: (Laughter).

DIRESTA: We would then (laughter) - you know, I made my own baby food and cloth diapered, which, I think, makes you like a homesteading hippie. But it's interesting because the social platforms do key off of signals that you give them. It's called collaborative filtering. And it's why recommendations get better because instead of only keying off of things that you have personally already typed into a search engine - meaning if I am a gardener and I type in gardening, it makes sense for it to give me more gardening stuff. But even if I have never typed in the word hiking, if there is a sufficient overlap between my behavior and my interests and someone else on the Internet who is sort of similar to me in browsing behavior or the content that they enjoy...

SOFIA: Right, right.

DIRESTA: I am going to start to see recommendations that those other types of people who it thinks are similar to me have also enjoyed.
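The collaborative filtering Renee describes can be sketched in a few lines. This is a toy illustration of the general idea, not Pinterest's actual recommendation system: users with overlapping interest sets are treated as similar, and a user is shown items that similar users liked but that she has never searched for herself. All names, data, and the similarity threshold here are made up for illustration.

```python
# Toy user-based collaborative filtering: recommend items liked by
# similar users, even if the target user never searched for them.
# Hypothetical data; not any platform's real algorithm.

def jaccard(a, b):
    """Overlap between two users' interest sets, from 0 (none) to 1 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(target, users, min_sim=0.3):
    """Suggest items that sufficiently similar users liked but the target hasn't."""
    scores = {}
    for interests in users.values():
        sim = jaccard(target, interests)
        if sim >= min_sim:
            # Weight each unseen item by how similar its fan is to the target.
            for item in interests - target:
                scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)

users = {
    "alice": {"gardening", "hiking", "baby food"},
    "bob":   {"gardening", "hiking", "camping"},
    "carol": {"knitting", "opera"},
}
me = {"gardening", "baby food"}
print(recommend(me, users))  # ['hiking'] - suggested despite never searching for it
```

Note the behavior Renee points out: "hiking" is recommended purely because a similar user liked it. The same mechanism that surfaces hiking pins to a gardener can surface anti-vaccine content to a new parent who only ever searched for baby food.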

(SOUNDBITE OF MUSIC)

SOFIA: So for Renee, in the early days of Pinterest, collaborative filtering meant because of her interest in cloth diapers and baby food recipes, she also got served up anti-vaccine content. This was a big problem on Pinterest. One study in 2015 found that 75% of vaccine-related posts on the platform were negative. Early last year, Pinterest finally addressed that problem, announcing it would only return vaccine-related searches with information from trusted public health sources. But it's not just Pinterest. Renee says from the earliest days of social media, bad health information has been rampant.

DIRESTA: It wasn't just anti-vaccine narratives. It was actually kind of conspiratorial, sensational, twists-on-the-truth kind of narratives that were framed in an extremely clickbaity (ph) kind of way - the things your doctor doesn't want you to know - you know, that sort of stuff. And those sorts of things were performing very well, and that's because there was a sensational headline. And it was getting tons of clicks and likes and shares and comments. That would be surfaced to more people. This sort of creating an app or an environment where users would keep coming back because there was always something stimulating and engaging was also something that was good for business for them.

SOFIA: Another way the Internet makes it possible for you to get bad information in the course of your search is a phenomenon known as a data void. That's when a specific search term or keyword turns up little or no results, and that can be exploited by people who want to spread misinformation. They can produce content that fills that void, so when somebody searches for that specific term, almost all the information they get is bad information.

DIRESTA: What I saw when I was looking at it in the context of health communication is if you searched for the vitamin K shot - which is an extremely routine intervention. It is not a vaccine. It is a shot that prevents babies from hemorrhaging, so it provides a vitamin that facilitates clotting, which is something that babies are not born with. And so when you searched for vitamin K, the top Google results and almost 100% of the social media results would, instead, surface articles by bloggers, particularly anti-vaccine bloggers, about why they had refused the vitamin K shot. And they kept calling it things like the vitamin K vaccine, and they would tell you that there were still toxins and preservatives, and this was going to harm your baby and your baby didn't need it. And this kind of led to - this was - I think this was one of these kind of pivotal moments where people began to, you know, to think about the impact that these data voids, that these bad search results, that this health misinformation was having...

SOFIA: Right.

DIRESTA: ...On the lives of real children.

(SOUNDBITE OF MUSIC)

SOFIA: This brings us back to a question Renee raised earlier - whether we should live in a world where the most popular information can be bad information. Google, for its part, has decided the answer to that question is no.

DIRESTA: Which is why they have a policy called Your Money or Your Life - and under the Your Money or Your Life policy, what they say is that search results related to financial or health information have to be held to a higher-quality standard. And they have...

SOFIA: Facebook also decided last year not to accept advertising dollars from accounts promoting health misinformation. And lately, the company has been trying to address big public health issues like the novel coronavirus with a framework called remove, reduce, inform.

DIRESTA: They can either remove the content, reduce its spread or inform people that it's false or misleading.

SOFIA: So, basically, if you share a piece of misinformation, Facebook might tell you, hey, that thing you shared isn't true.

DIRESTA: Those are the three options that they have on the table, and that's just Facebook. This is something that happens with YouTube...

SOFIA: Sure.

DIRESTA: ...with Twitter, with Pinterest, with - I mean, you name it. They all, at this point, have some sort of protocol that they use for thinking about health misinformation, of course, but also misinformation during a crisis situation.

SOFIA: How much of the onus do you think is on the platforms themselves to engage with this content and to kind of mediate this?

DIRESTA: That's an interesting question. So the platforms - they are not liable for the content that they host, and that's because that was a decision that was made back in the 90s - 1996, when they were just being - just kind of coming into being. There was a concern that if you made the platforms liable for what people shared on them, they would be kind of sued out of existence, so that's the liability framework that they operate under - is that they have the right to moderate. They have the freedom to moderate, but they don't have the obligation to moderate. And so for a while, that led to a little bit of, like, a kind of laissez-faire, hands-off approach to what was moderated, particularly because they did want to maximize freedom of expression and, also, they didn't want to have to deal with the hassle of moderation.

SOFIA: Yeah, that sounds right.

DIRESTA: Nobody likes moderating, as it turns out.

(LAUGHTER)

DIRESTA: People just get mad at you, you know?

SOFIA: Right.

DIRESTA: But the realization that there were certain societal harms that could be traced back to the absolute unfettered use of these platforms, whether that was election manipulation, voter suppression, health misinformation, cancer quackery - all these different topical areas that public sentiment, in particular, and then regulator attention both kind of came together in 2017. And that's where, I think, you started to see the real push for accountability, so it's been kind of an ongoing effort, pretty major shift over the last two years - and more work to do.

SOFIA: One more note on the anti-vaccine part of the story - it can seem, sometimes, like anti-vaccine proponents are running this sneaky, smart, underground campaign, operating in dark corners of the web, deploying tactics that are hard to understand and hard to fight. Renee DiResta said, actually, it's just the opposite.

DIRESTA: I don't think that there's very much that they do that any marketing, particularly guerrilla marketing or a social media marketing manager wouldn't have thought to do. It's more the way that they're able to capitalize on the diehard, true-believer community, particularly for financial resources, that makes them very interesting, in part, because they're extremely well-resourced and very well-funded. And so that's where, I think, in particular, the World Health Organization, the CDC, doctors, et cetera - like, the doctors who are producing the counter-content in particular, they are not being funded (laughter). And they're doing it out of passion and out of, you know, their own, kind of, spare time. And so it's an interesting, kind of asymmetric challenge to think about, you know, the ways that more resources can be deployed to grow the counter-movement.

SOFIA: Dr. Baldwin, the doctor we talked to who was targeted by anti-vaccine proponents, is one of those doctors out there using her own passion and time to fight misinformation online. Not only is she a pediatrician from Cincinnati, now she's also one of the faces of a counter-content effort in a vast information war.

You've done a lot of press about this. This is far from your first interview.

BALDWIN: Yeah.

SOFIA: And so now when people Google you, everything that comes up is...

BALDWIN: ...About this.

SOFIA: ...About the stance you've taken and not necessarily about, you know, the post or the - or your doxed Yelp page, thankfully.

BALDWIN: Yeah. It's interesting because I didn't necessarily sign up to be the poster child for vaccine advocacy right now. But I do think that because of this and because I was able to kind of stand up and call them on what happened - I think that was another thing that, not only did I kind of face that attack but I also put it out there like, hey, guys, this is what they did. So I feel like people are rising up and saying, OK, we've got to take back this space. We've got to take back the social media space, and I definitely am feeling very strongly about that, too.

(SOUNDBITE OF MUSIC)

SOFIA: Thanks to pediatrician Nicole Baldwin and also to Renee DiResta at Stanford. As we did yesterday, we'll put links to some of Renee's reporting and research on misinformation online in our episode notes. This episode was produced by Brent Baughman, edited by Viet Le and fact-checked by Emily Vaughn. I'm Maddie Sofia. We're back with more SHORT WAVE from NPR tomorrow.

(SOUNDBITE OF MUSIC)

Copyright © 2020 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.