DAVID GREENE, HOST:
Okay. You'd like to believe that people in the intelligence community, professionals with access to classified reports, are really good at predicting what might happen next in the world. But we're about to hear about an experiment suggesting that regular citizens like you and me might actually be better at it. The study was funded by the intelligence community. It's been going on for three years, and even people involved have been astonished by the results. NPR's Alix Spiegel begins with one of the program's best forecasters.
ALIX SPIEGEL, BYLINE: The morning I met her, Elaine Rich was sitting at the kitchen table of her small townhome in suburban Maryland trying to estimate refugee flows in Syria. Now, this wasn't the only question that she was considering that morning. There were others. Would North Korea launch a new multistage missile before May 10, 2014?
By April, would the government of Syria and the Syrian supreme military command announce a cease-fire agreement? And how about Russia? Would Russian armed forces enter Kharkiv in Ukraine by May 10? Rich's answers to these questions would eventually be evaluated by the intelligence community, but she didn't feel a lot of pressure. This wasn't her full-time gig.
ELAINE RICH: I'm just - I'm just a pharmacist. I mean nobody cares about me, nobody knows my name, I don't have a professional reputation at stake. And, you know, it's this anonymity that actually gives me freedom to make true forecasts.
SPIEGEL: And Rich does make true forecasts. She is curiously good at predicting future world events. See, for the last three years, Rich and 3,000 other average people like her have been quietly making probability estimates about everything from Venezuelan gas subsidies to Middle Eastern politics as part of the Good Judgment Project, an experiment put together by three well-known psychologists and some people inside the intelligence community.
Rich says that when she first heard about the experiment, it immediately appealed to her.
RICH: I'm always reading the news so this was a way to make the news count.
SPIEGEL: And did you think, oh, I would be good as a forecaster?
RICH: No.
SPIEGEL: At the time, Rich didn't know a lot about international affairs, and she hadn't taken much math in school.
RICH: Girls don't do that stuff, you know, especially, you know, when I was young.
SPIEGEL: But she signed up, got a little training in how to assign probabilities from the people running the program and then was given access to a website which listed dozens of carefully worded questions of interest to the intelligence community, along with a place for her to enter her numerical estimate of their likelihood.
RICH: And the first two years I did this, all you do is choose numbers. You don't have to say anything about what you're thinking, you don't have to justify your numbers. You just choose numbers and then you see how your numbers work.
SPIEGEL: And for the last three years, Rich's numbers have worked well. She's now in the top one percent of the 3,000 forecasters, which means she has been classified as a superforecaster, someone who is extremely accurate when predicting stuff like: Will there be a significant attack on Israeli territory before May 10, 2014? In fact, Rich is so good that she's been put on a special team with other superforecasters whose team predictions are reportedly 30 percent better than intelligence officers with access to actual classified information.
So I mean, like, do you go to obscure Internet sources or are you just using, like, Wikipedia to make your judgments?
RICH: Usually I just do a Google search.
SPIEGEL: Your basic process is a Google search.
RICH: Yes.
SPIEGEL: Which at least for me raises this question: How is it possible that a group of average citizens doing Google searches in their suburban kitchens can outpredict members of the United States intelligence service with access to classified information? How can that be?
PHILIP TETLOCK: You know, I think everybody's been surprised by these outcomes.
SPIEGEL: Philip Tetlock is one of the psychologists who, with Barbara Mellers and Don Moore, created the Good Judgment Project. Now, for most of his career, Tetlock has studied the problems associated with expert judgment. And all that research brought him to at least two important conclusions.
First, if you want people to get better at making predictions, keep score of how accurate their predictions turn out to be, so that they have feedback. But also, if you take a large crowd of different people with access to different information and pool their predictions, you will be in much better shape than if you rely on a single very smart person, or even a small group of very smart people.
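Tetlock's first conclusion, keeping score, implies some numerical measure of how good a probability forecast was once the outcome is known. The transcript doesn't name the project's scoring rule, but a standard one for yes/no geopolitical questions like these is the Brier score; the sketch below is illustrative, not a description of the project's actual method.

```python
# Illustrative sketch of "keeping score" of probability forecasts using
# the Brier score (squared error between a forecast and the 0/1 outcome).
# The transcript does not specify the Good Judgment Project's exact rule;
# this is a common choice for this kind of question.

def brier_score(forecast: float, outcome: int) -> float:
    """0.0 is a perfect forecast; 1.0 is maximally wrong."""
    return (forecast - outcome) ** 2

# A forecaster who said 90% for an event that happened scores better
# (lower) than one who hedged at 60%.
print(round(brier_score(0.9, 1), 3))
print(round(brier_score(0.6, 1), 3))
```

Scoring every question this way, then averaging, gives each forecaster the running feedback Tetlock describes.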
TETLOCK: The wisdom of crowds is a very important part of this project, and it's an important driver of accuracy.
SPIEGEL: The wisdom of crowds was a concept first documented by the British statistician Francis Galton in 1906. Galton was at a fair where 800 people had tried to guess the weight of a dead ox in a competition. And after the prize got awarded, Galton collected all their guesses so he could figure out how far off the mark the average guess was.
Now, most of the guesses were really bad, way too high, way too low. But when Galton averaged them together, he was shocked. The dead ox weighed 1,198 pounds. The crowd's average: 1,197. Their guess was one pound off. Again, Philip Tetlock.
TETLOCK: There's a lot of noise, there's a lot of statistical random variation, but it's random variation around a signal, a true signal, and when you add all of the random variation on each side of the true signal together, you get closer to the true signal.
SPIEGEL: That is the wisdom of the crowd.
TETLOCK: Does that make sense?
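Tetlock's point about random variation canceling around a true signal is easy to reproduce. Here is a small simulation in the spirit of the ox anecdote; the individual guesses are randomly generated, not Galton's actual data.

```python
import random

# Simulation of the ox anecdote: 800 noisy guesses scattered around a
# true value. Individually most are far off, but their average lands
# close to the truth, because errors on either side cancel out.
# These guesses are simulated, not Galton's historical data.
random.seed(42)
TRUE_WEIGHT = 1198  # pounds, the ox's weight in the anecdote

# Each guess is the true signal plus a large individual random error
guesses = [TRUE_WEIGHT + random.gauss(0, 150) for _ in range(800)]

crowd_average = sum(guesses) / len(guesses)
print(round(crowd_average))  # close to 1198, far closer than most guesses
```

With errors of 150 pounds either way, the average of 800 guesses typically sits within a handful of pounds of the true weight, which is the cancellation Tetlock describes.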
SPIEGEL: So the point of the Good Judgment Project was to figure out if what was true of the dead ox is true for world events as well. And it is. In fact, Tetlock and his team have even engineered ways to significantly improve the wisdom of the crowd, all of which greatly surprised Jason Matheny, one of the people in the intelligence community who got this experiment started.
JASON MATHENY: They've shown that you can significantly improve the accuracy of geopolitical forecasts compared to methods that had been the state of the art before this project started.
SPIEGEL: What is so striking about all of this, of course, is the idea that you can get very accurate predictions of geopolitical events even without access to secret information, and that access to classified information doesn't automatically and necessarily give you an edge over a smart group of average citizens doing Google searches from their kitchen tables.
I mean, do you think that this way of predicting world events could replace what is accomplished in the intelligence community?
MATHENY: No, I don't think so. I think it's a complement to existing methods rather than a substitute.
SPIEGEL: Matheny says that though Good Judgment predictions have been extremely accurate on the questions they've been asked so far, it's not clear that this process will work in every situation.
MATHENY: There are likely to be other types of questions for which open source information isn't going to be enough.
SPIEGEL: In any case, in the next couple of weeks the Good Judgment Project will begin recruiting more forecasters, and Elaine Rich, the suburban Maryland pharmacist, thinks more people like her, average citizens, should give it a shot. Alix Spiegel, NPR News, Washington.
Copyright © 2014 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.
NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.