Bloomberg pollster J. Ann Selzer ignited something of a political firestorm this week when her national survey for the news organization showed President Obama leading GOP presidential candidate Mitt Romney among likely voters by 13 points, 53-40 percent.
Most recent polls have shown the race much closer.
Some have Obama with a slight lead, others have Romney up by a little. A Pew Research Center poll released Thursday, for example, shows Obama up 50-46 percent, while the same-day Gallup daily tracking poll has Romney leading 47-45 percent.
The results of Selzer's survey, not surprisingly, have spawned countless columns, blog posts and questions about the results, her methods and what we should make of it all.
Did the respected Des Moines-based pollster talk to too many highly educated voters? Did she affect the results by including in the survey questions about what people thought about former President Bill Clinton and developer Donald Trump? Should the poll be dismissed as simply an outlier?
The reaction and dissection had become so fevered in the world of political pundits and poll analysts that Bloomberg posted a story late Thursday taking a deeper look at the poll.
The Bloomberg piece says that the higher-than-average number of college educated respondents in the poll "may have produced a higher level of support for Obama."
Selzer, quoted in the story, said that though college graduates were overrepresented in the current survey by 6 percentage points compared with 2008 exit polls, "every education subgroup votes for Obama over Romney."
"The two groups that deviate most from the 2008 average — some college and college degree — are where the race is closest. It is simply not true that the higher education of our likely voter sample, compared to the 2008 electorate, is the reason for our higher margin for Obama," she said.
Selzer says the reality is "we won't know whether this poll is or is not an outlier for a while. What we do know is that using the same sampling method, same weighting procedure, same question wording in roughly the same order, Obama has opened a lead over Romney when we had them tied in March."
In addition to the Bloomberg story, Selzer also released a memo in which she provides her answers to questions raised and theories posed about the poll's methodology. The suggestion that the poll surveyed too many educated people, she says, is the only one that "holds water," but she adds that Obama still would have had a double-digit lead in the poll even if the results had been further adjusted for the oversample.
She also points out that one problem in comparing polls is that they often look at different groups — in this case, the poll looked at likely voters rather than the wider group of registered voters.
Here's the memo, which gives a fascinating glimpse into the high-stakes polling process:
Some Questions and Some Answers about the Bloomberg National Poll
J. Ann Selzer, Selzer & Company
June 21, 2012
When people ask how to make sense out of a bevy of widely varying polls, I say, pick a poll and follow it over time. The trend will tell you something meaningful when you compare the same method and the same question with the same universe over time. Our Bloomberg National Poll in March, before the Republican nominating contest had concluded, showed Mitt Romney and Barack Obama tied at 47% each. In our June poll, released Wednesday morning, we showed a 13-point lead for Barack Obama—53% to 40%.
Why does our poll look different from other recent polls? Here are answers to a few theories posited in the last day or so about why our poll might be an outlier.
Did you change your methodology? No. It was the same method, the same question, and the same universe. The poll produced roughly the same demographic make-up as our March poll.
Was this a poll of likely voters or registered voters? One of the problems of comparing one poll to another is each may have its own way of looking at the electorate. We ask how likely respondents are to vote in November and if they say they will definitely vote—the top box on a four-point scale—we define them as likely voters. Seventy-two percent in our current poll pass that screen. Some polls ask if the respondent is registered to vote. That is a larger universe and includes people who know today they have no intention to vote, so we prefer a narrower definition of a likely voter.
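The screen Selzer describes can be sketched in a few lines. This is a minimal illustration with invented responses, not the firm's actual data or code: respondents who choose the top box ("definitely will vote") on the four-point scale are kept, and the share passing the screen is computed.

```python
# Illustrative sketch of a top-box likely-voter screen (invented data).
# Only respondents answering "definitely" on the four-point likelihood
# scale are classified as likely voters.
responses = ["definitely", "probably", "might", "will not",
             "definitely", "definitely", "probably", "definitely"]

likely_voters = [r for r in responses if r == "definitely"]
share = len(likely_voters) / len(responses)  # fraction passing the screen
print(share)  # 0.5 with this invented sample
```

In the actual June poll, 72 percent of respondents passed this screen.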
Did you load up the front of the questionnaire with questions that would tilt uncertain voters toward Obama and away from Romney? No. Our poll asks the horse race after the right/wrong track question about the nation (which is 2-1 pessimistic), after a question about the most important issues facing the country (unemployment and jobs), after favorable/unfavorable impressions of a number of personalities and institutions, and after presidential job approval on a number of elements (including the economy, job creation, and policies on trade with China—which deliver bad news for the president). The March poll included the questions in the same order as the June poll. We conclude the question order did not manufacture a 13-point lead for Barack Obama.
Did you interview too many younger voters who tend to like Obama more? No. Sixteen percent of likely voters in our poll are age 18 to 29, about the same as in the 2008 election.
What about race? Did you under-represent the white vote? Possibly, but only a little. Our likely voter sample includes 69% who are white, 12% who are black, 10% who are Hispanic, and 5% who are some other race. The exit polls for 2008 showed whites to be 74% of the electorate, blacks to be 13%, Latinos to be 9% and other to be 5%. Our poll includes a place to capture those who refuse to give an answer (4%). If we remove those and repercentage the results, whites grow to 72%. This is a contest where race is a strong predictor of vote (the white vote goes for Romney 50% to 43% and the non-white vote goes for Obama 78% to 16%). So, is the small difference in race a reason for a 13-point difference overall? It may contribute, but it is not the sole difference.
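The repercentaging step above is simple renormalization: drop the 4% who refused to answer and rescale the remaining shares to sum to 100. A short sketch, using the figures from the memo:

```python
# Repercentage the racial composition after dropping refusals,
# using the shares reported in the memo (refusals are 4%).
shares = {"white": 69, "black": 12, "hispanic": 10, "other": 5, "refused": 4}

known = {race: pct for race, pct in shares.items() if race != "refused"}
total = sum(known.values())  # 96, the non-refusing share
repercentaged = {race: round(100 * pct / total) for race, pct in known.items()}

print(repercentaged["white"])  # 72, matching the memo
```

This is why whites grow from 69% to 72%: 69 / 96 is about 71.9, which rounds to 72.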
Too many Democrats? Our poll showed more self-identified Democrats than Republicans, which is in line with both our March poll and other polls. Among likely voters, our spread is 42% Democrat (including independents who say they lean toward the Democratic party) and 37% Republican (including leaners) with 19% saying they are totally independent. That is a +5 margin for Democrats compared to a Pew Research average of +8 for Democrats. It is not the case that we interviewed an extraordinarily large number of Democrats, which would account for a big margin for Barack Obama.
Maybe you had a higher-educated respondent pool and they tend to like Obama. Maybe we did. Of all the theories, this one holds some water. The 2008 exit poll shows 24% with a high school education or less, compared to our 20% among likely voters; the 2008 electorate had 31% with some college and we had 23%. In 2008, 28% of voters had a college degree and 17% more had some postgraduate education; we had 34% with a college degree and 21% with some postgraduate exposure. However, in our poll, every education subgroup votes for Obama over Romney.
We played around with the data to test whether our findings would have changed had our education distribution looked more like the 2008 exit poll. By "played around," I mean we created a 52-cell weight variable accounting for age, race, and education. The presidential contest becomes a 10-point race: 51% for Obama and 41% for Romney. It is still a double-digit lead for Obama and would likely have created as much stir as our 13-point lead.
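The reweighting exercise Selzer describes is post-stratification: each demographic cell's weight is its target population share divided by its share of the sample, and vote preferences are then re-tallied with those weights. The sketch below illustrates the idea with education alone (the memo's actual weight crossed age, race, and education into 52 cells); the education shares come from the memo, but the within-cell vote preferences are invented for illustration.

```python
# Illustrative post-stratification sketch. Education shares are from the
# memo; the per-cell Obama shares below are hypothetical, NOT poll data.
sample_share = {"hs_or_less": 0.20, "some_college": 0.23,
                "college": 0.34, "postgrad": 0.21}   # Bloomberg poll
target_share = {"hs_or_less": 0.24, "some_college": 0.31,
                "college": 0.28, "postgrad": 0.17}   # 2008 exit poll

# Cell weight = target share / sample share
weight = {cell: target_share[cell] / sample_share[cell]
          for cell in sample_share}

# Hypothetical within-cell Obama support, for illustration only
obama_share = {"hs_or_less": 0.55, "some_college": 0.52,
               "college": 0.53, "postgrad": 0.55}

# Reweighted topline: sample share * weight collapses to target share,
# so this is the target-weighted average of cell-level preferences.
weighted_obama = sum(sample_share[c] * weight[c] * obama_share[c]
                     for c in sample_share)
```

Because the college and postgraduate cells get weights below 1 and the less-educated cells get weights above 1, reweighting shrinks Obama's margin somewhat, which is consistent with the memo's 13-point lead dropping to 10 points under the 52-cell weight.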
In the end, we will soon know whether this poll is, in fact, an outlier. Potentially, this poll caught the electorate when the wind was at Barack Obama's back for a brief moment in time.