ROBERT SIEGEL, HOST:
Some police departments are trying to predict the future. They're using software that lays out statistics from the past to project where crime is moving. Police in Los Angeles say it has worked well in predicting property crimes. Now, Seattle is about to expand it to gun violence. But as we hear from NPR's Martin Kaste, police are hesitant about relying too much on computers.
MARTIN KASTE, BYLINE: This all started as a research project. An anthropologist at UCLA, Jeff Brantingham, wanted to see if computers could model future crime, the same way they model earthquake aftershocks. Turns out they can.
DR. JEFFREY BRANTINGHAM: It predicts sort of twice as much crime as any other existing system, even going head-to-head with a crime analyst.
KASTE: Older systems, like the famous CompStat in New York, show where crime has been. This looks forward.
BRANTINGHAM: The model will actually predict other locations that effectively say, even though there was a crime somewhere else in your environment, the risk is still greatest in this location today in the next 10 hours or the next 12 hours.
KASTE: Brantingham and his colleagues are now selling this predictive system to police departments. They call their product PredPol. And at this point, you're probably already thinking about the sci-fi movie "Minority Report." But this is different. There are no psychics sleeping in bathtubs, for one. But more to the point, this isn't about predicting the who of future crime, just the where.
SERGEANT CHRISTI ROBBIN: These red boxes are predictions of where the next crimes are likely to occur.
KASTE: Police Sergeant Christi Robbin pinches and zooms on a map of Seattle. Earlier this year, the city started using PredPol to predict property crimes. Now, it's expanding the system to predict gun violence as well, making it the first city to do so. At the start of every shift, patrol cops are assigned to the boxes on the map.
ROBBIN: So we're asking that they spend the time in that 500-by-500-square-foot box, doing whatever proactive work they can to prevent that crime.
PHILIP MONZON: ...next block down and just - so this is the spot to park. We're going to park here.
KASTE: Officer Philip Monzon has just pulled up inside his box. Today, it's a city block near the Seattle waterfront.
MONZON: They want visibility. They want contacts with businesses as are appropriate, and anybody who's wandering through the area. So here we go.
KASTE: This area has a lot of parking lots, and PredPol's forecast includes car thefts. As Monzon passes a green Honda, he pauses. The guy inside seems to be ducking under the dashboard.
MONZON: I just want to make sure if he has a key or if he's going to pull out anytime soon.
KASTE: The car starts; the guy probably does have the key. But why didn't Officer Monzon challenge him, just in case?
MONZON: I don't really have enough - I'm not going to just single out one guy in a Honda.
KASTE: And here's where this gets tricky. The courts say police need reasonable suspicion in order to stop somebody. That suspicion can come from a lot of things, even someone's furtive movements, as the police like to say. But can it come from the fact that someone is occupying an imaginary red box drawn by a computer?
MONZON: No, no. I don't know. I wouldn't make a stop solely on that.
KASTE: That's probably the right answer, says Andrew Guthrie Ferguson. He's a law professor at the University of the District of Columbia, and he's taken a special interest in the constitutional implications of PredPol. He says the departments using it have told police not to use it as a basis for stops. But he wonders how long that can last.
ANDREW GUTHRIE FERGUSON: The idea that you wouldn't use something that is actually part of the officer's suspicion and not put that in may come to a head when that officer is testifying and either is going to have to omit a fact that really was the reason he stopped or she stopped the suspect or is something that they will then admit on the stand and then the issue will be raised for the court to address.
KASTE: And it may be that PredPol is a constitutional basis for stopping someone. Some might see it as more objective than a cop's judgment, less prone to racism or other kinds of profiling. Ferguson says that is possible, but we need to be careful.
FERGUSON: I think most people are going to defer to the black box, which means we need to focus on what's going into that black box, how accurate it is, and what transparency and accountability measures we add to it.
KASTE: In other words, even though computers aren't biased, the stats feeding them might be. And he says if we're going to follow an algorithm, we should at least be willing to check the math. Martin Kaste, NPR News, Seattle.
(SOUNDBITE OF MUSIC)
SIEGEL: This is ALL THINGS CONSIDERED from NPR News.