Nate Silver On Missed UK Forecasts: We Flubbed The Margin Of Error

Data guru Nate Silver of FiveThirtyEight.com tells NPR's Scott Simon how all the forecasts, including his own, were so far off in predicting the results of this week's British election.

SCOTT SIMON, HOST:

Among the many pollsters and pundits who were oh so wrong was Nate Silver, the statistician who's founder and editor-in-chief of FiveThirtyEight, the website owned by ESPN and ABC that analyzes data and forecasts elections and sports. Of course, Mr. Silver became famous for correctly forecasting President Obama's election in 2008 almost down to the last elector, the number 538. And this week, he predicted an incredibly messy outcome in the British elections, a chaotic nightmare scenario, and forecast that Conservatives would win 283 seats, Labour 270. He was only off by about 50 seats. Nate Silver joins us now. Thanks so much for being with us.

NATE SILVER: Thank you.

SIMON: So what happened?

SILVER: Well, I should clarify that this is not really my forecast. This was put together in a partnership with a group of U.K. academics. But, you know, we thought it was going to be a pretty good model.

SIMON: But, I mean, I was impressed by - on your website, you said you didn't want to make any excuses.

SILVER: I don't want to make excuses for the forecast. But at FiveThirtyEight, we're trying to build a team here, right? And it's not just about me anymore, and we deserve credit as a team and blame as a team.

SIMON: Right.

SILVER: But I'm the leader of the team, and so I deserve - you know, I think you always want to take more blame than credit. I mean, it is a model and, you know, kind of what are the assumptions in the model that were wrong? I mean, that we can talk about.

SIMON: Well, what do you think were the assumptions in the model?

SILVER: I think for one thing, you'd actually seen cases like this occur in the past, where Conservatives did quite a bit better than their polls. So it's the combination of the polls being off, plus the kind of traditional swing calculations don't apply so well in an election where you have four or five parties that are relevant.

SIMON: You said on your website you didn't want to make any excuses and added it's a good lesson as we begin to plan our coverage for the 2016 U.S. election.

SILVER: Yeah, absolutely. I mean, to me, part of the forecast is saying what's the most likely outcome? But it's also how you specify the range of uncertainty. So...

SIMON: Yeah.

SILVER: ...The right forecast probably wasn't (laughter) - 'cause no one's clairvoyant, it probably wasn't that, oh, it's going to be a conservative sweep. It was probably that hey, we don't know a lot and maybe there's a 15 or 20 percent chance that somehow they can maintain their majority. So I think, you know, when I say that I think our forecast was wrong and I don't want to make excuses for it, I think the part that was wrong is the part that was too narrow in terms of the margin of error.

SIMON: Yeah. I've got to tell you, Mr. Silver, you're not giving people much of an incentive to read FiveThirtyEight seriously if you're essentially backing away from the idea that you can reach any conclusions. That's why people read you.

SILVER: Sometimes the right conclusion is to say that people are too sure of themselves, right?

SIMON: Yeah.

SILVER: Sometimes it's to anticipate uncertainties in your environment. And over the whole course of our life span at FiveThirtyEight - and we've predicted many things in politics and sports and other events for many, many years - historically, one thing that's distinguished us is that our probabilities have been right over the long run. That when we say something has an 80 percent chance of happening, it happens about 80 percent of the time and it doesn't happen about 20 percent of the time. And often we're the people saying you know what? There's more uncertainty in the outlook than people would concede. It's not as obvious as just kind of picking a poll off the shelf.

SIMON: Yeah. What are some of the mistakes with polling these days?

SILVER: Well, I mean, one of the problems with polling is that most people aren't just going to sit there at home and respond to someone's call on a landline, right? People screen their calls; people have mobile phones. Only about 10 percent of people respond even to the best polls. Some polls have gone online, but unlike a phone number in the old days where everyone had one, how you kind of ping someone online is more complicated. So you can get a sample, but it might not be a random sample that actually resembles the voting population. We rely on polling not just to forecast elections - that can be fun - but also to understand how people feel about important issues from health care to the economy to abortion to gay marriage. They're a key conduit to our democracy. And if they're failing, it's a lot more important than the game show of which pundit or which forecast does best in a particular election.

SIMON: Nate Silver, the founder of FiveThirtyEight. Thank you so much for being with us.

SILVER: Thank you.

Copyright © 2015 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.