Protect yourself from election disinformation

It's almost time to vote. NPR's Miles Parks explains what research can tell us about how to combat fake political news — and why it's so tricky to separate fact from fiction.



This is NPR's LIFE KIT. I'm Marielle Segarra.


SEGARRA: I just don't know what to believe anymore - that is a common thing I have heard and said, honestly, over the past couple years. Whether we're talking about politics or COVID or something else entirely, we are getting served up a lot of information all the time in the news, on social media, from friends and family. And it's just a lot to process. And often, you read things and later find out they're not true at all. But it'll be time to vote soon. And it's important before you do to know what information is reliable and how to filter out the rest. To help us do that, we have Miles Parks, who covers voting for NPR. Hey, Miles.

MILES PARKS, BYLINE: Hey, happy to be here.

SEGARRA: Yeah, happy to have you. So covering misinformation is actually part of your beat, right?

PARKS: It is. And it didn't start that way. You know, I started covering voting shortly after the 2016 elections. But I don't know if you noticed over the last couple years, covering misinformation and covering voting kind of feels like covering the same thing. And that's not just for me as a reporter. I mean, election officials I talk to who run voting at the state and local level say a huge part of their time - which, 10 or 20 years ago, used to be spent just counting ballots and getting ballots out to people - is now spent on trying to get the truth out to people about our voting systems.

SEGARRA: Do you have a sense of why misinformation has become so common recently?

PARKS: Well, technology really plays a part in that. I mean, there's no question that people have easier access to bad information than ever before. But there's kind of an ongoing debate on whether social media, as a platform, is the cause of so much misinformation and polarization in the U.S. electorate or whether it's just kind of the mirror showing what's happening in U.S. society but not playing into it. I think there's no way that you can have this conversation and not talk about former President Donald Trump. I mean, he is the first U.S. president who used the presidency to spread lies about the American election system. And so the most credible figure we have had in U.S. government spread a lot of those lies. And that has meant that those lies have kind of continued after he has left the presidency.

SEGARRA: So it's kind of a perfect storm.

PARKS: It is. There's a million different things and things we probably haven't even thought about, which is part of why, you know, no one has solved this problem yet.

SEGARRA: Yeah, I was hoping that you'd have some solutions.

PARKS: I wish I could say I did, right? Yeah. It's like, you've been covering this thing for years and years. But I'm not the only one. You know, there are researchers who've been thinking a lot about this for years and years. And I was talking, actually, with an election official in Arizona, Stephen Richer, who runs elections for Arizona's largest county. And I asked him straight up. You know, I said, so what do you do to solve this?

STEPHEN RICHER: Well, if I said I knew, that would be a lie. We have ideas, and we've been trying our hardest to employ all manner of tactics. But I don't think that anyone has cracked disinformation as a societal challenge.

PARKS: Maricopa County, where he runs elections, has been the target of a lot of conspiracy theories, and they've actually done a great job of increasing transparency around their elections process, of getting out in front of voters and kind of debunking conspiracy theories around voting. But I think it's important to realize that this issue seems to have gotten beyond just picking out individual pieces of information and trying to debunk them. And a lot of research nowadays is spent looking at the kind of systemic issues that are driving misinformation as a problem.

SEGARRA: Yeah. And, you know, I want to ask a question because Stephen Richer, he mentioned disinformation, and we've been using the word misinformation. There is a difference between those two, right?

PARKS: Yeah. So disinformation is all about intent. I mean, if you know for certain that a person is spreading a piece of bad information and they are gaining politically, they are gaining financially from spreading that information, we call that disinformation. When it's somebody at a party who just, you know, is saying something they saw on Facebook that's incorrect, we call that misinformation 'cause that person isn't necessarily gaining anything from spreading that information. But it's still bad information.

SEGARRA: Right. So, OK, let's talk about the research. Were there any promising findings?

PARKS: There are, actually, surprisingly so. I mean, it's not all doom and gloom here. One recent study that I reported on actually focused on people who feel U.S. elections are fraudulent. I do think it's important at this point to say that is not true. You know, there were audits. There were court cases. There were paper hand counts of the 2020 election. It was considered by both Republican and Democratic officials to be safe, secure and effective. But as we know, due to the former president's campaign against it, there are a lot of people who believe the lies around the 2020 election. This voting advocacy group, the Voting Rights Lab, kind of wanted to test a series of messages, just to try to answer the question of, like, how do we bring people back into trusting U.S. elections if they seem to be leaning toward trusting conspiracy theories instead?

SEGARRA: Does it help to just, like, tell them the facts?

PARKS: No. I mean, that's actually one of the things that they found, was that if you kind of take them step by step into the things they believe and try to debunk each one of them, it doesn't seem to be the most effective way to change people's minds around the whole of the democratic system. The thing that worked that they found was basically presenting them with this kind of unifying, nonpartisan, patriotic statement about America as a country and how important elections are to, you know, America working as a country. And what they found is that presenting people with a statement like that actually had a big effect especially on Trump voters. I talked to Tresa Undem, who led the research on this for the firm PerryUndem.

TRESA UNDEM: And here's what was stunning that I almost never see in social science research or our own survey research, is more conservative voters, after hearing that affirmative narrative, they were, in the double-digit points, more likely to say they trust the process for counting votes in elections compared to a control group.


PARKS: So without that affirmative, patriotic messaging about elections and their importance in this country, just 35% of Republicans said, yes, I trust the counting of votes in this country. But after they were presented with this kind of patriotic narrative about the importance of elections, 55% of Republicans said, yes, I trust how elections are counted in this country.

SEGARRA: So what does that tell us then?

PARKS: Well, I think when we talk about improving this misinformation problem, some of the field has moved on from just focusing on individual lies, you know, fact-checking the heck out of this thing, saying, you know, here's why vote-by-mail ballots are actually counted correctly. I think if this thing gets better, from the researchers I talked to, it's through actually decreasing polarization in U.S. society, which is what that statement that these voters read kind of aims to do.

SEGARRA: So did the researchers also talk to Democrats?

PARKS: Yeah, they did. But I think what's interesting is this sort of statement - this positive, patriotic statement didn't have much effect on Democratic voters because right now, on a whole, almost all Democratic voters trust elections in the United States. We know from research over the last few decades that voters who won the most recent election are much more trusting of the election system as a whole. So I think it'll be interesting to watch, and it'll be important to watch as we're kind of navigating this polarized environment how Democrats feel about U.S. elections in an election where they eventually do lose.

SEGARRA: Right. OK. Well, we are headed into a midterm election. So from your reporting, do you have any ideas or thoughts on what voters might encounter when it comes to bad information?

PARKS: Sure. So, I mean, the trend over the last couple of years as social media platforms have gotten better at policing very obviously false content is towards what's known as gray area misinformation. So this is where somebody shares a factual news story or a true data point or an actual photo but uses it to kind of further their false worldview. And so the example I go back to over and over again is a story I reported right around the time that COVID vaccines were coming out. Well, people who wanted to push the idea that COVID vaccines were killing people - which, again, there was no data to suggest that - they were sharing news articles about people who had died after receiving the vaccine.

Now, as we know, people die every single day from all sorts of causes, and millions of people were receiving the COVID vaccine. So it stands to reason that some people who received the COVID vaccine would then die from causes that weren't necessarily connected to the COVID vaccine, right? But people were able to share true news stories and plant those seeds of doubt about the vaccines. In those situations, there's very little that the social media companies can do to police it because they're sharing a true news article. It just happens to be furthering this false narrative.

SEGARRA: Right. It's pretty nuanced. So how do people protect themselves from this kind of thing?

PARKS: I think the biggest thing that people can do when it comes to gray area misinformation is be on the lookout when you're scrolling or you're reading and you see a piece of information that just fits so neatly into your worldview, and it makes you angry. It makes you upset. It makes you have a really strong emotional reaction. We know that people are less kind of critically inclined to think about the information they're receiving when those kind of anger - those strong emotions come in. And so people who are making money off kind of polarizing - getting clicks and polarizing the American electorate try to get you upset, try to get you angry.

And so I think, when you feel those emotions, instead of taking them as an indicator that you should kind of rush off to text that article or that photo to 10 people you know, take that anger or that sadness or whatever you're feeling as an indication that you should maybe double-check the source or double-check the actual kind of key nugget of that information.

SEGARRA: Yeah. And I think we can all relate to that.

PARKS: Yeah.

SEGARRA: So if you're upset, take a moment. That makes sense. Anything else that we should think about?

PARKS: I was actually talking to Steve Simon the other day - he's the secretary of state of Minnesota - and I was asking him exactly that. You know, do you have any tips for people ahead of the election when it comes to misinformation and thinking about the information ecosystem? And what he said is that we as a public really need to distinguish between the purveyors of misinformation and the recipients of it.

STEVE SIMON: I distinguish and I advise others to distinguish between the superspreaders of the disinformation, who are doing it often knowing that it's false, doing it for political purposes, economic purposes, sometimes both - on the one hand, there's those folks. And on the other hand are everyday folks who most of us know - friends, neighbors, coworkers, relatives - who might not know what to think and are taken in by some of this. And making that distinction is important.

PARKS: I think that point here is the one that I take away from this, that it's not - even if you don't think you're somebody who is consuming misinformation all the time, if you think of polarization as the bigger problem here and the idea that we should just have empathy for the people who potentially are reading bad information or are susceptible to this stuff, then I think it's something that every single person can kind of work on in terms of taking the temperature down a little bit.

SEGARRA: Yeah. Well, I wonder if, like, let's say a family member shares something with you that seems false. What's the approach? Do you look it up and see, OK, so this isn't true, and then send them a link fact-checking it?

PARKS: I mean, I think it is kind of a case-by-case basis because I actually get that question a lot. If it's somebody who you're really close with, who you have that sort of relationship with, then sometimes being direct actually can help the cause, you know, potentially giving them a link or talking through it right then. Most cases, it's not like that. You know, it's a person who you might not necessarily be talking to on a day-to-day basis. And what researchers I've talked to say to do in that situation is to draw it out more. Like, don't be so quick to yell or, you know, send an angry email. Ask questions. You know, be curious.

And I think if you can kind of get to the root of the piece of information, whether that's talking about not necessarily the piece of information but the source of it and kind of talking about that website or talking about that person and kind of having a broader discussion - if you start lecturing somebody on this stuff, it's just not usually going to work. I think thinking of it as a conversation and asking questions is probably the way that has the most success.

SEGARRA: Yeah. Yeah. It seems like we're in a - kind of a tangled mess here, and we're going to need to figure it out together with some kindness.

PARKS: It's a really confusing time to try to get good information right now, and I think bringing a little bit of that empathy and that understanding - that, like, this could be you - is helpful.

SEGARRA: Totally. All right. Well, NPR's Miles Parks, thank you so much.

PARKS: Thank you for having me.


SEGARRA: For more LIFE KIT, check out our other episodes. Miles has one about how to vote and another that goes deeper into detecting misinformation. You can find those at And if you love LIFE KIT and want more, subscribe to our newsletter at This episode of LIFE KIT was produced by Clare Marie Schneider. It was edited by Ben Swasey. Our visuals editor is Beck Harlan. Our digital editor is Malaka Gharib. Meghan Keane is the supervising editor. Beth Donovan is the executive producer. Our intern is Jamal Michel. Our production team also includes Andee Tagle, Audrey Nguyen, Michelle Aslam, Summer Thomad and Sylvie Douglis. Julia Carney is our podcast coordinator, and engineering support comes from Ko Takasugi-Czernowin. I'm Marielle Segarra. Thanks for listening.

Copyright © 2022 NPR. All rights reserved. Visit our website terms of use and permissions pages at for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.