
'FiveThirtyEight' Statistician Nate Silver Reports On The 2016 Election

Silver analyzes polls and predicts election outcomes on his website, FiveThirtyEight. This year's is "maybe the most fascinating nomination race that we've ever seen," he says.

TERRY GROSS, HOST:

With so many political polls coming out, it's hard to evaluate which are most accurate and how much weight we should give them in predicting winners, which is why so many people turn to Nate Silver for his polling analysis. He's a statistician who's famous for founding the website FiveThirtyEight, which analyzes polls based on accuracy and methodology, aggregates polls and forecasts outcomes. In 2008, he correctly called all 35 Senate races and the winners of the presidential contests in 49 of 50 states.

The site's title, FiveThirtyEight, is a reference to the number of votes in the Electoral College. There's a new "FiveThirtyEight" podcast in which Silver and other reporters from the site analyze polls and primary outcomes and talk politics. In addition to political polls, the site reports on science and health, economics, culture and sports. "FiveThirtyEight" is now affiliated with ESPN and was formerly affiliated with The New York Times.

Nate Silver, welcome to FRESH AIR. Polls are supposed to measure public opinion. But do you think political polls also sway public opinion?

NATE SILVER: I think they can. There is evidence of something called the bandwagon effect, which is that people like to be associated with a winner. Donald Trump, I think, knows this effect well. Go to any one of his speeches, and he spends literally half the speech talking about how great his polls are here and there. He stopped that a little bit after Iowa, but I'm sure he'll start it back up now after he won New Hampshire. So there is some of that. I would say, though, it's more prevalent in the primary than the general election. The reason being in the primary you have multiple candidates. Until recently, you had 17 Republican candidates. People usually like several of those candidates, and they have to coordinate and decide: which one of the six Republicans I really like should I vote for? And it's usually the guy - probably a guy on the Republican side, especially - who's up in the polls.

GROSS: Why is that?

SILVER: Partly, it's technical. I mean, maybe I like Jim Gilmore, the former governor of Virginia. But if I know that only 0.01 percent of people...

GROSS: (Laughter).

SILVER: ...Are voting for him, I'm not really having much of an impact with my vote, whereas choosing between Trump and Ted Cruz and Marco Rubio and Jeb Bush - that will have quite an effect on where the delegates go in my state and how the results are interpreted as we go into other states.

GROSS: You rate the polls based on their accuracy and on their methodologies. Most people don't know the difference between a good poll and a bad poll, an accurate poll and a poorly done poll. Do you think bad polls have a bad effect?

SILVER: Oh, for sure. And you have a lot of bad polls that are dependent upon the good polls to even be remotely accurate. So what they'll do is they'll say, well, we have some bad data. Maybe we used a robo-dialer - instead of having a human being talk to people, you just set up an automated script. Maybe they'll call 30,000 phone numbers and actually only get 500 respondents - a very low response rate. And then they'll say, you know, these numbers don't look very good. But Pew Research or the NBC/Wall Street Journal poll or the ABC/Washington Post poll has Clinton up 3, so I'm going to tweak my assumptions and have Clinton, let's say, up 4 or 5 instead. That's not what the good pollsters do. What the good pollsters do is a lot more scientific. But there are a lot of bad pollsters that kind of trade on the name and the reputation that, frankly, scientific - but also very expensive - polls have built up over a number of years.

GROSS: If I heard you correctly, you're saying some polls cheat, that they just kind of make up the score.

SILVER: Let's - I would say - I want to be careful about what I would say, right? I'd say that these polls usually have a lot of choices in terms of how they model turnout - for example, how they determine who's a likely voter and who isn't, how they weight for demographics. And they tend to make choices that are in line with what other polls do. The technical name for this is herding. The polls sometimes all say the same thing. So if you remember when...

GROSS: That's H-E-R-D.

SILVER: H-E-R-D-I-N-G. So if you remember when then-Senator Clinton upset Senator Obama in the New Hampshire primary eight years ago, in 2008, she had been down by 8 points. What was remarkable is that it wasn't just that she was down in one or two polls. It was, like, the same margin in every poll - 8 points in this very, very volatile environment. And once one pollster weighs in, especially a good poll, then people say, you know what? I'm not quite sure what's going on here, but I feel more comfortable in the pack. And so of the many different valid models that I might choose from, I'm going to pick the one that is consistent with the consensus.

So ordinarily, in most walks of life, when you see all the signs pointing to the same thing, that's a good, reassuring sign. Sometimes, in polling, that can be worrisome (laughter). It means that everyone is making the same assumptions. Everyone is looking at what everyone else is doing, and therefore, you can have real surprises when every single poll is wrong.
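The dynamic Silver describes can be illustrated with a toy simulation (a sketch with invented parameters, not anything FiveThirtyEight actually runs): independent polls of the same race scatter according to ordinary sampling error, while pollsters who nudge their numbers toward the running consensus produce an artificially tight cluster.

```python
import random
import statistics

def run_polls(true_support, n_polls=20, sample_size=500, herd_weight=0.0, seed=42):
    """Simulate a series of polls of the same race.

    herd_weight=0 gives fully independent polls; herd_weight>0 pulls each
    new result toward the average of the polls published before it.
    """
    rng = random.Random(seed)
    results = []
    for _ in range(n_polls):
        # Sampling error from interviewing sample_size random voters.
        hits = sum(rng.random() < true_support for _ in range(sample_size))
        raw = hits / sample_size
        if results and herd_weight > 0:
            consensus = statistics.mean(results)
            raw = (1 - herd_weight) * raw + herd_weight * consensus
        results.append(raw)
    return results

independent = run_polls(0.48)
herded = run_polls(0.48, herd_weight=0.7)

# Herded polls cluster far more tightly than sampling error alone implies,
# which looks reassuring but just means everyone shares the same assumptions.
print(statistics.stdev(independent) > statistics.stdev(herded))  # True
```

The point of the sketch is Silver's: agreement among herded polls is not independent confirmation, so when the shared assumption is wrong, every poll misses together.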

GROSS: How far ahead do you think it's reasonable to try to predict what's going to happen based on polls?

SILVER: So it's a different answer for the general election and the primaries.

GROSS: I mean right now (laughter).

SILVER: You know, I would say about three weeks out, it's worth looking at polls in individual states. But there can be switches right up until the last minute. In 2012, I think in, like, the first seven states that voted on the Republican side, the candidate who was ahead in the polls a week before the race lost. So you can look at the polls, but you need these very big margins of error around them. And it's kind of an exercise in humility in the primaries, whereas in the general election, the polls have a record of being pretty accurate. So basically, all that we do is run around during the primaries telling people, be careful, be careful - the polls are not that reliable. And then, by the general election, people are trained to be very wary of the polls, when historically they have turned out to be very accurate. So we kind of flip around and say, you know, actually, I know it's only October, but the polls have been right nine out of 10 times by this point in the year.
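For context on those margins of error: the textbook 95 percent margin of error for a simple random sample can be computed directly, and primary polls routinely miss by much more than it, since the formula covers only sampling error, not wrong turnout assumptions. A quick worked example (the 500-person, 35-percent figures are illustrative, not from the interview):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# A 500-person poll showing a candidate at 35 percent:
moe = margin_of_error(0.35, 500)
print(round(100 * moe, 1))  # 4.2, i.e. roughly +/- 4.2 percentage points
```

Even this best-case figure leaves a wide band, and late swings of the kind Silver describes fall well outside it.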

GROSS: If you're just joining us, my guest is Nate Silver, who's the founder and editor in chief of FiveThirtyEight, the famous website which grades political polls based on their accuracy and methodological standards and forecasts winners in elections.

Let's take a short break here, then we'll talk some more. This is FRESH AIR.

(SOUNDBITE OF MUSIC)

GROSS: This is FRESH AIR. If you're just joining us, my guest is Nate Silver, the founder and editor in chief of the website FiveThirtyEight, which is famous for grading political polls based on their accuracy and their methodological standards, for aggregating polls to see what the terrain looks like and, based on that, for forecasting winners. So the way the primaries and the caucuses are set up now, Iowa and New Hampshire really count for a lot. As a pollster, when you look at voting from, you know, a mathematical point of view, and you try to see, like, who's representative, does it make sense to you to have those two states be the first states and to set the tone for what's to come?

SILVER: I think not particularly. They're not the most representative states. And we've spent time in both states - I like both states a lot as a kind of election tourist - but as I think should be obvious, they're both extremely white states. They don't have a lot of big cities. There aren't really major cities at all in New Hampshire. They both tend to be middle to high income and education levels. And so there's an attempt to rectify that with Nevada and South Carolina. But it is strange that these two states, which are, you know, some of the less representative states of the country as a whole, have so much influence over how the rest of the country will wind up voting.

GROSS: Every vote in America is supposed to be equal, but it often seems like they're not, in the primaries and the caucuses - that some votes count more than others. Some states count more than others. And then you've got the delegates and the superdelegates, and the math is done differently for that. For example, you know, Hillary won by the tiniest margin in Iowa and Bernie Sanders won by a really wide margin in New Hampshire, yet she's ahead in delegates.

SILVER: Well, I'd be a little bit careful. I know a lot of news organizations lump together elected delegates and superdelegates when they report the numbers. But remember, in 2008, Clinton began with a large lead over Obama in superdelegates. And that flipped as the election went on, because Obama had done well enough that he won the majority of elected delegates, and superdelegates said, I don't want to override this mandate that Obama won from Democratic voters. So if Clinton is counting on that to hold up - if Sanders, let's say, beats her in Nevada and South Carolina, then I think you'll see panic, number one. But eventually, I think Democrats would say, you know what? Even though we might not want Sanders as our nominee, to override what our voters just told us would be an even bigger and more consequential disaster for the party in the long term. But again, that's just a guess. Sanders beating Clinton would be a much bigger upset than Obama beating Clinton eight years ago. And so to some extent, if that happens, or if Donald Trump wins, we're in uncharted territory in terms of how the party establishments would react.

GROSS: Any other thoughts you have about what would happen if Donald Trump wins and how the Republican establishment would react?

SILVER: I mean, it's maybe the most fascinating nomination race that we've ever seen, or at least that I've ever seen. What's interesting is that in New Hampshire, even though Donald Trump has led basically every poll there since July, only about one-sixth - 18 percent - of the negative ads in New Hampshire were directed against Trump. All these candidates competing to be the moderate or, quote, unquote, "establishment" choice are sniping at one another, and the front-runner is coasting by and just won 35 percent of the vote, more than doubling the second-place candidate's total. So ordinarily, after a big win in New Hampshire, people would say, well, we were a little bit iffy after Iowa, but now we know that Trump is real - now everyone and their mother will be coming after Donald Trump. He would be in the spotlight. It might not hurt him. He has some very loyal voters. But at least it might prevent him from gaining further.

But instead, in South Carolina, you have the fourth-place candidate and the fifth-place candidate, Rubio and Bush, in a giant rivalry with one another instead of aiming at the front-runner, Donald Trump. So by the time they have this figured out, Donald Trump is going to have accumulated more delegates. Ted Cruz will probably have a couple of states in the South. The Texas primary has a lot of delegates - that's coming up on March 1, I believe. And they might all be fighting for second place, or if not second place, a case where they can only win by going to the convention in Cleveland, which could be very, very damaging too.

GROSS: So how accurate are polls now? And here's what I'm thinking. In terms of how technology has changed, a lot of people don't have landlines. There are federal restrictions on how you can use cellphones for polling - you're not allowed to do those kinds of automated robocalls. And a lot of people don't answer their phones. They're just going to wait to see, like, who called, and then decide if they want to call them back or not. I know there's some Internet polling. So would you say that the new technology, on the whole, is helping to make polls more or less accurate? What are the biggest flaws now?

SILVER: Oh, no, I have to say that, on balance, I'm worried about the state of polling right now. So take traditional telephone polls - and let's take the best polls, like Pew Research, for example, who do call cellphones. Their response rate - so how many people actually respond to the poll, out of the ones they would like to have respond - has gone from about 35 percent 20 years ago to, I think, 9 percent now. So they're kind of hoping the 9 percent of people they do reach are representative of the 100 percent of people they would like to reach. In some sense, it's worked remarkably well. I mean, the polls were pretty good in the past two presidential elections. But we've seen cases in other democracies - the United Kingdom, for example, Greece, Scotland, Israel - where the polls were pretty far off on election day. And I think that should worry people. Eventually, I think, like everything else, we'll lead our lives online, and online polling will be the standard. But people haven't really figured out what the gold-standard methodology is for online polling.
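The standard defense against the problem Silver names - hoping the 9 percent who answer stand in for the 100 percent - is to weight respondents so that demographic groups match their known population shares. A minimal sketch with invented numbers (not Pew's actual weighting scheme, which is far more elaborate):

```python
# Hypothetical race: young voters are 40% of the population but, because of
# low response rates, only 10% of respondents; older voters are 60% vs 90%.
population_share = {"young": 0.40, "old": 0.60}
respondent_share = {"young": 0.10, "old": 0.90}
support_in_sample = {"young": 0.70, "old": 0.40}  # candidate support by group

# Unweighted estimate averages respondents as they came in.
unweighted = sum(respondent_share[g] * support_in_sample[g] for g in population_share)

# Weighted estimate re-scales each group to its population share.
weighted = sum(population_share[g] * support_in_sample[g] for g in population_share)

print(round(unweighted, 3))  # 0.43 - skewed toward the over-represented group
print(round(weighted, 3))    # 0.52 - matches the population, if group-level support is right
```

The catch, and the reason for Silver's worry, is that weighting only corrects bias along traits you measure; if the 9 percent who answer differ from everyone else in ways the weights don't capture, the bias survives.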

GROSS: I've heard journalists and pundits counting Twitter followers of candidates and Facebook likes as if they were polls accurately measuring something. I've heard journalists and pundits reciting Google searches of candidates as if that was indicative of something meaningful. Are those accurate measures of anything?

SILVER: I think people should be wary. They're interesting data. We don't have enough history to know which of those tell you anything and which don't, although there was one useful indication that proved to be prescient both in Iowa and New Hampshire, which is: which candidates are voters searching for on Google? So in Iowa, we saw, in the last 24 hours before the caucus, a big surge in searches for Ted Cruz and Marco Rubio. In New Hampshire, we saw a big surge in searches for John Kasich. And in both cases, those candidates beat their polls on election day in Iowa and New Hampshire, respectively. So when you see something going on on the ground or in the data in the states voting next, and it's not happening in other states, that to me is pretty meaningful. If you see a big spike in searches for Trump because he said something funny on "Saturday Night Live," then that's a different story.

GROSS: My guest is Nate Silver, founder of the blog FiveThirtyEight, which analyzes polls, aggregates polls and predicts outcomes. We'll talk more after a break. This is FRESH AIR.

(SOUNDBITE OF MUSIC)

GROSS: This is FRESH AIR. My guest is Nate Silver, the founder and editor in chief of the website FiveThirtyEight, which is famous for rating and analyzing political polls and predicting election outcomes. You have also written that you think Donald Trump wouldn't do as well in a general election as he's done so far in the primaries. Why are the primaries a better field for him than a general would be?

SILVER: Well, as best we can tell, Donald Trump has very intense support from maybe a third, or 35 percent, of the American population. And people outside of that one-third of the electorate don't like Donald Trump very much or, in some cases, at all. So right now, out of every candidate, Democratic or Republican, he has the worst favorability ratings with the population as a whole. Now, I think he would probably change and adapt. He's kind of warned people - warned Republicans, anyway - that, I will probably change my positions and become more of a centrist if I win the nomination. And that could help him, potentially. But Donald Trump's popularity is so far confined to an important segment of the Republican base. That might be enough for him to win the nomination if no one else is doing any better. But I'd put it like this: I think he's a very, very high-risk candidate. Having been one of many people to be dismissive of his chances early on, I certainly wouldn't rule out the chance that Donald Trump could become not just the GOP nominee but the next president. He could also be a candidate, though, who winds up losing by 12 or 15 percent in the worst landslide since 1984.

GROSS: So Jill Lepore in her New Yorker article about polling asked the question, are polls good for democracy? What do you think?

SILVER: I think good polls are good for democracy. And the reason why good polls are good for democracy is because it's sometimes, for better or worse, the only chance that regular people have to have their say. You can vote for president, you can vote for senator or for governor, but you can't vote on a trade deal. You can't vote on a treaty. You can't vote on a big tax cut, for example. And so, you know, I'd rather have people have their view represented in polls than misrepresented by reporting on the issue that might, frankly, report the views of elites and not regular people. When you have reporters going up to New Hampshire or Iowa and saying here's the feeling I have on the ground, well, you know, that's interesting local color but it can't possibly do as well as actually opening up a phonebook and randomly sampling hundreds of voters in one of those states.

GROSS: What about the horse race polls, the polls that just measure who's ahead? Are those good for democracy?

SILVER: You know, journalists, long before there were polls, were always diagnosing the horse race, and they always got behind front-runners. And when you're gaining, it looks like everything is going well for you. When you're slipping, you know, they think you're toast. So if you're going to have horse race coverage at all, I'd rather you use some modicum of polling to do it, you know, because, again, the alternative is just totally the echo chamber, where there's no grounding in public opinion at all. But certainly, I'd agree with Jill and other people that maybe things have gotten a little bit echo-chambery with respect to the polls, where polls can beget other polls. They can produce momentum. People pile onto the bandwagon, then it reverses all of a sudden. Obviously, no one is perfect.

At FiveThirtyEight, though, we try to say, hey, we're going to report on polls, but remember, number one, that polls are not always accurate. In fact, they're often inaccurate in the primaries and the caucuses. And number two, to have some awareness that we're a site a lot of people read, and that can affect the so-called narrative about the race, too. So being aware of how the media is a player, and of how people understand the horse race and how they understand politics, instead of speaking in kind of the voice of God or the third person, is really important to us, too. So after the debate in New Hampshire last week, for example, our headline was, we thought Marco Rubio did really badly, but we're not sure yet what New Hampshire voters will think. People criticized us for that afterwards. They said, well, shouldn't it have been obvious that voters would think the same way and thought Rubio did badly? But the point is, having some separation between what you as a reporter think and what the population thinks - having some self-awareness of that - is important to the way that we tend to look at polls.

GROSS: Well, Nate Silver, thank you so much for talking with us.

SILVER: Thank you.

GROSS: Nate Silver is the founder and editor in chief of the website FiveThirtyEight. If you'd like to catch up on FRESH AIR interviews you missed, like our interviews with George Miller, the director of the "Mad Max" movies, comic and actor Zach Galifianakis, who stars in the new FX series "Baskets," actor Joel Grey, who writes about coming out in his new memoir, and the filmmaking duo Mark and Jay Duplass, check out our podcast. You'll find those and many other interviews. FRESH AIR's executive producer is Danny Miller. Our technical director and engineer is Audrey Bentham. Dorothy Ferebee is our administrative assistant. Our associate producer for online media is Molly Seavy-Nesper. Roberta Shorrock directs the show. I'm Terry Gross.

Copyright © 2016 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.