Disinformation On Facebook Could Affect The Way You Vote
LULU GARCIA-NAVARRO, HOST:
Facebook is in the spotlight again. It's been promising to take major steps to limit disinformation and interference before the election. And it promised that after this election, it won't allow posts falsely claiming victory by the candidates. Facebook has been facing a double whammy of election disinformation recently, with some of it on the site coming from political campaigns and some from foreign or domestic bad actors trying to influence us.
But does what we see on Facebook actually impact the way we vote? We're joined now by Jessica Feezell. She is a professor at the University of New Mexico who studies the intersection of social media and political behavior.
JESSICA FEEZELL: Thank you, Lulu. Thank you for having me.
GARCIA-NAVARRO: It is a pleasure to have you. Much, as you know, has been said about Facebook's role in politics and elections. But does Facebook really have an impact? How much do people actually take away from what they see on the site?
FEEZELL: So we know several things. We know that people who are politically engaged online are usually also politically engaged offline. So the information that we encounter definitely feeds into sort of our political ecosystem and impacts the way that we behave in real - you know, feet-on-the-ground, traditional political participation.
We also know that the information that we encounter on Facebook has an effect on what we call agenda setting - or the issues that we think are important at any given time. And that can have implications for the issues that we take into the voting booth with us or the things that guide our electoral decisions.
GARCIA-NAVARRO: Does your research then show that people would change their vote based on this sort of incidental contact?
FEEZELL: No, it doesn't. Changing someone's vote is a very difficult thing to do because you have all of these other socializing forces - religion, gender, education, region, personal experiences. All of these things come into play when we act politically. It could, however, make you think about certain things that make you feel ineffective or upset and sad about democracy, and maybe then you don't go vote. But the likelihood that you bump into some piece of information on Facebook, and it changes who you vote for is very unlikely.
GARCIA-NAVARRO: You talk about these sociological forces that are based on your gender, your religion, your age. What kind of person do we know is more likely to be affected by something they see or read on Facebook? Is there a particular demographic that is more vulnerable to being influenced?
FEEZELL: So usually, when we - you know, we model this stuff and look at it empirically, there's a number of things that we control for - many of the things that you just listed but also political interest - how engaged they are, how invested, how, perhaps, knowledgeable they are about the political world around them. It's the people who have lower levels of political interest who are most susceptible to incidental exposure to political information, whether it's good, well-crafted, quality news reporting or mis- and disinformation campaigning.
GARCIA-NAVARRO: But aren't those also the people that are less likely to vote?
FEEZELL: Perhaps. And there is some research by Leticia Bode that shows that it's possible that people who go online and bump into information and start to participate lightly online might sort of open up a gateway to higher levels of offline participation.
So, for example, if you're online and you see a friend that you trust and really like sharing information about Black Lives Matter, for example - you think that might be interesting, and you trust this person. And so perhaps then you dig into more information about it, and you become activated by it. And maybe even it's an issue that resonates enough with who you are that it drives you to participate. It drives you into voting.
GARCIA-NAVARRO: There's been a lot in the news recently about steps that Facebook has said it will take, is taking to sort of combat disinformation. Some of that has to do with taking down particular pages. It has also said it will limit new ads, political ads, in the week before the election. Is that enough?
FEEZELL: So I'm happy that they're doing something. What they're doing, however, I don't believe is enough. I think that they should go further than one week and perhaps take it to a full month and pull down the political ads and let Facebook be a place of social engagement and discussion and sharing among people but not necessarily a place where people are able to pay for that speech.
It would also be nice if, instead of being retroactive in taking information down and removing it, there were some way they could be more proactive in keeping that information from appearing in the first place, because we know that when people are exposed to disinformation and misinformation, it's quite difficult to actually correct that. There's still some lingering seed of doubt about the issue that you saw, even if you've seen that it has been fact-checked and proven to be wrong.
GARCIA-NAVARRO: Jessica Feezell, a professor at the University of New Mexico, thank you so much.
FEEZELL: Thank you.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.