One of the rarely admitted secrets about journalists is that many of us are functional "innumerates" — another way of saying "mathematically illiterate." Oh sure, we can add and subtract reasonably well. But with some exceptions, journalists generally don't know, don't understand or simply aren't interested in numbers. As for more complex subjects such as statistics and probability, well... many journalists would be hard pressed to tell the difference between "median" and "mean."
Is this because so many journalists started off getting an arts degree rather than one in the sciences or engineering? Whatever the reason, we are especially reminded of our innumeracy during election years, when poll results start to flood into the newsroom.
Many listeners (and journalists) are confused by polls and pollsters: by the plethora of statistics, by the "plus or minus 4 percent" mantra chanted by pollsters draped in the robes of electoral oracles as they perform their ritual interpretations of the numbers.
For the next five weeks until the election, NPR, along with other media, will report the results of public opinion polls. Many journalists and listeners will wait eagerly to hear whether Sen. Kerry can close the gap with President Bush. The elements of the electoral horse race will be presented as statistical truth every few days.
Bamboozled by the Pollsters?
But other listeners feel that the media's innumeracy permits journalists to be bamboozled by pollsters. They say that journalists have not done enough to explain how the polls are conducted, why we should believe what the polls are saying or what the poll results may portend for the election's outcome.
Some listeners believe that NPR may be reporting poll results out of context. No information is given about the number of people interviewed (the sample size), nor is the statistical degree of accuracy (the margin of error) of the polls explained well.
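Those two missing numbers are connected: the margin of error falls directly out of the sample size. A minimal sketch of the standard 95 percent confidence formula for a simple random sample (the function name and sample sizes here are illustrative, not drawn from any particular NPR poll):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion from a simple random sample.

    p = 0.5 is the conservative worst case; z = 1.96 corresponds to
    95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll interviews about 1,000 people:
print(round(margin_of_error(1000) * 100, 1))  # → 3.1 (percentage points)

# A smaller poll of 400 has a noticeably wider margin:
print(round(margin_of_error(400) * 100, 1))   # → 4.9
```

This is why the "plus or minus" figure matters: halving the margin of error requires roughly quadrupling the sample size, which is expensive, so most polls settle for a margin of three to four points.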
These listeners feel that simply reporting on a poll that shows President Bush in the lead implies an inevitable victory by the Republican.
Listeners' confusion is increased because no two polls are numerically alike, although many resemble one another by showing President Bush slightly ahead of Sen. Kerry.
What is clear is that many listeners are unclear about how the polls are conducted and whether they legitimately predict what will happen on Nov. 2.
So what is a poll supposed to do?
Rules for Reading Polls
One of the clearest explanations of what is trustworthy and what is less trustworthy about polling comes from professor Kathleen Woodruff Wickham of the University of Mississippi in Oxford.
Wickham says that voters need to ask the following questions whenever a poll is published:
Who sponsored the survey?
How many people were interviewed (sample size)?
Were the interviews done with a random or a self-selected group? This is important because of the increased use of Internet polling, where people may be more motivated to respond than if they were telephoned at random.
What is the wording of the question? Is the language emotional? Neutral?
What is the raw data? Do the math yourself, if you can.
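Wickham's last suggestion, doing the math yourself, can be applied to a reported lead. A common rule of thumb holds that the margin of error on the gap between two candidates is roughly twice the poll's stated single-candidate margin, so a small lead in a close race may be a statistical tie. A hedged sketch (the lead and sample size below are hypothetical):

```python
import math

def is_statistical_tie(lead_pts, n, z=1.96):
    """True if a reported lead could plausibly be sampling noise.

    Uses the conservative rule of thumb that the margin of error on the
    difference between two candidates' shares is about twice the
    single-proportion margin (p = 0.5 is the worst case).
    """
    moe_single = z * math.sqrt(0.25 / n)
    return lead_pts / 100 < 2 * moe_single

# Hypothetical: a 3-point lead in a poll of 1,000 respondents
print(is_statistical_tie(3, 1000))   # → True: inside a ~6.2-point margin on the gap
print(is_statistical_tie(10, 1000))  # → False: a 10-point lead exceeds it
```

The point is not that such a poll is wrong, only that a lead smaller than the margin on the difference tells you much less than the horse-race framing implies.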
I approached one of NPR's resident High Priests of Polling, Senior Editor Marcus Rosenbaum. He has been involved with NPR polls done with the Kaiser Family Foundation and with the Kennedy School of Government at Harvard University. He also examines how NPR reports on other polls. Marcus is one of the few people who, in my opinion, can explain polls and polling in a cogent manner.
How Do We Know?
I asked Marcus, "How do we know whom to trust when it comes to polls that are often at variance with each other?"
Marcus responded that most polls tend to "trend" in similar directions over the same period of time. They rarely point in opposite directions, although that can happen.
"Polling," says Marcus, "attempts to provide a statistically reliable snapshot of public opinion at the moment it was taken. These polls are not predictive of the final outcome on Election Night. Most of them seem to trend in the same direction, even if their numbers aren't the same."
There is also something known as a "rogue," or "outlier," poll. That's the one that stands out by heading in a different direction from the others. What's amazing is that sometimes the rogue poll proves right while all the others are wrong. This has happened in the past. Recently, though, polling has become more reliable, which polling organizations attribute to more sophisticated approaches to sampling, question phrasing and testing.
Marcus added that social scientists are divided over whether polls actually influence voters. But polls do have an impact on how people behave as voters. "In an election this close, the goal is not to influence the so-called 'undecideds,'" he said. "It's to get your supporters out and, at the same time, to turn off your opponent's supporters."
Polling once was thought to push the "undecideds" toward the leading candidates, on the premise that everyone wants to back a winner. Marcus disagrees. He thinks that there may be just as much motivation for an undecided voter to support the underdog, especially in a close race.
I am somewhat less trusting of polls than Marcus is, although he always makes a strong case for their reliability. He may even have made me slightly less innumerate as well.
Nevertheless, I think a skeptical approach to polls is necessary, especially when a race is as close as this one. Many polls will be released over the next few weeks, making claims and counterclaims for the candidates.
NPR will serve its listeners, and any innumerate journalists, by not reporting every poll that comes out. That would emphasize the horse-race aspect of the election and play into the hands of the augurs. NPR instead might consider reporting any changes in campaign policies, strategies or tactics made in response to what the polls are saying.