GUY RAZ, HOST:
So on the show today, we've been asking, Can We Trust The Numbers? And, of course, there are many, many examples where numbers are crucial. When Anne Milgram was just starting out as the attorney general of New Jersey back in 2007...
ANNE MILGRAM: I was, you know, 35 years old at the time and really eager to, you know, sort of work on high-level criminal justice issues.
RAZ: She turned her attention to one of the deadliest cities not only in her state but in the country - Camden, N.J.
MILGRAM: It's a very unique power, but the New Jersey AG is the chief law enforcement officer. And so the AG can take over any police department, any case, any prosecutor's office. And, of course, on day one, I was then, as attorney general, in charge of the most dangerous city in America.
RAZ: Wow. So when you became attorney general, you also became the police chief, basically, for Camden.
MILGRAM: Yes, exactly. And I went to this CompStat meeting where the senior police commanders come in the room and they are grilled by the leadership of the department on trends in crime, on police responses to crime. And so I went to see this in Camden. And, you know, there was a guy there with a pad of yellow stickies and a black Sharpie marker. The first officer said we had a, you know, robbery last week on such and such a street corner, no suspects. And the guy wrote it on the yellow sticky and put it on the wall.
And on the wall was a map of Camden. And so we were sort of loosely mapping where crime was. And then the next senior officer said, you know, we had a homicide, no suspects. And this went on for about two hours. And at the end, we had a wall filled with yellow stickies but virtually no idea of where the next crime would happen or how we could reduce crime.
MILGRAM: I saw for the first time the systemic failure: without data and without information, in a system that's run really subjectively based on our gut and our instinct, we don't know what we're doing. We don't know whether we're doing it well. And we don't know whether or not we can do it better.
RAZ: So to radically change things, Anne drew inspiration from an unlikely place, the book and later the movie "Moneyball."
(SOUNDBITE OF FILM, "MONEYBALL")
JONAH HILL: (As Peter Brand) There is an epidemic failure within the game to understand what is really happening.
RAZ: Anne decided to "Moneyball" the criminal justice system in Camden, N.J., using data and statistics.
(SOUNDBITE OF FILM, "MONEYBALL")
HILL: (As Peter Brand) And this leads people who run Major League Baseball teams to misjudge their players and mismanage their teams.
MILGRAM: When you think about baseball scouts, what they did solely back in the day was they would use their instinct and their judgment. And they would go out and basically say, OK, this guy's going to be great. This guy's not going to be great. And then you have "Moneyball," where you've got the Oakland A's coming in and basically saying, look, when we do the statistical analysis, we find out that what really matters is on-base percentage. That's how teams score runs. That's how teams win. And so in criminal justice, the system is remarkably similar. It is a very subjective system where the police officer, the prosecutor, the judges are making decisions based on their experience and instinct.
RAZ: So you decide to do this in New Jersey, in criminal justice. How? What did you do?
MILGRAM: We did a number of things. We pulled data to understand where the police officers were being deployed and where the crimes were happening. We started to look at who the next shooters were and at what data and information we had. We put GPS on the police cars, so at every moment, we could understand which car was closest to the 911 call.
We literally pulled every piece of information that we had access to in the department. And we started to gather new forms of data and information as well and to build systems around that to track patterns and changes in crime. So we would know when a neighborhood was heating up, when it was cooling down and where we should be spending our time and our effort.
RAZ: Here's more from Anne Milgram on the TED stage.
(SOUNDBITE OF TED TALK)
MILGRAM: It worked for the Oakland A's, and it worked in the state of New Jersey. We took Camden off the top of the list as the most dangerous city in America. We reduced murders there by 41 percent, which actually means 37 lives were saved. And we reduced all crime in the city by 26 percent. We also changed the way we did criminal prosecutions. So we went from doing low-level drug crimes that were outside our building to doing cases of statewide importance on things like reducing violence with the most violent offenders, prosecuting street gangs, gun and drug trafficking and political corruption.
RAZ: After Anne finished her term as attorney general, she went to work for a foundation to tackle bail reform. And she used the same kind of "Moneyball" approach that she used in Camden.
(SOUNDBITE OF TED TALK)
MILGRAM: We have 12 million arrests every single year. Less than 5 percent of all arrests are for violent crime, yet we spend $75 billion - that's B for billion - dollars a year on state and local corrections costs. Right now, today, we have 2.3 million people in our jails and prisons. And we face unbelievable public safety challenges because we have a situation in which two-thirds of the people in our jails are there waiting for trial. They're just waiting for their day in court. And 67 percent of people come back. Our recidivism rate is amongst the highest in the world.
The reason for this is the way we make decisions. Judges have the best intentions when they make these decisions about risk, but they're making them subjectively. What we need in this space are strong data and analytics. So I went out and built a phenomenal team of data scientists and researchers and statisticians to build a universal risk assessment tool so that every single judge in the United States of America can have an objective, scientific measure of risk.
And the tool that we've built - what we did was we collected 1.5 million cases from all around the United States. And we found that there were nine specific things that mattered all across the country and that were the most highly predictive of risk - things like the defendant's prior convictions, whether they'd been sentenced to incarceration, whether they've engaged in violence before, whether they've ever failed to come back to court. And with this tool, we can predict three things. First, whether or not someone will commit a new crime if they're released. Second - for the first time, and I think this is incredibly important - we can predict whether someone will commit an act of violence if they're released. And that's the single most important thing that judges say when you talk to them. And third, we can predict whether someone will come back to court.
And every single judge in the United States of America can use it because it's been created on a universal data set.
(SOUNDBITE OF MUSIC)
RAZ: So Anne, we've been talking about the "Moneyball" side of this, and "Moneyball's" great. Everyone loves "Moneyball." But we haven't talked about the "Minority Report" side of this, which is a little bit darker. Because earlier in the show, we were talking about algorithmic bias, and Joy Buolamwini pointed out, you know, the more sinister side of algorithms, especially in predictive policing. Like, you could say, for example, that, you know, in most American cities, low-level crimes might involve a young man between the ages of 17 and 25, probably a young man of color because men of color are more likely to be arrested than white men doing the same crimes. And you could give that kind of statistic, in theory, to a police department, and they could say, OK, we've got to focus our efforts on policing 17- to 25-year-old men of color. And that opens up a whole other set of problems.
MILGRAM: Yeah. So here's how I would think about it - and I'll come back to the data bias in a moment - but I think the first question is, can we accept the current system as it is? And I would argue that the answer is no. You know, we have the highest rate of incarceration in the world. We spend $280 billion a year. We have 70 or 80 million Americans who now have criminal arrest records. We are not as smart as we could be about using resources in a cost-effective way, and we are not as fair or equitable as we should be.
The second point is that we really don't know a lot about what we're doing. This lack of information is simply unacceptable to me. It's really hard for me to understand how criminal justice could be a space where we just leave it to gut and instinct.
Now, on the bias piece, I am someone who believes that virtually all data is biased - that there is bias in data. The question to me is not whether or not we should use data in criminal justice; it's how we should use data. The standard for technology and for data is not perfection. It's, is it better? Can we make an improvement upon our current system?
RAZ: I mean, can we get to a place quickly where we can gather enough data, enough statistics to actually create a more just criminal justice system, a full 360 system that actually treats everybody equitably?
MILGRAM: We have to. We need to start pulling the data. We need to start understanding what's happening, and we need to start thinking about this. But we have to understand what's happening, who's in our system, what are our outcomes and how do we actually make the public safer? How do we reduce crime? And I think, you know, we're not doing a very good job right now unless we start embracing data and thinking about how and when we use it.
(SOUNDBITE OF MUSIC)
RAZ: Anne Milgram. She's the former attorney general of New Jersey and now a professor at the NYU School of Law. You can see her full talk at ted.com.
(SOUNDBITE OF SONG, "NUMBERS DON'T LIE")
THE MYNABIRDS: (Singing) Baby, if you want to be right, I will let you be right. I will let you be right. You know that the numbers don't lie. Oh, no, the numbers don't lie. Two wrongs will not make it right.
RAZ: Hey, thanks for listening to our show, Can We Trust The Numbers, this week. If you want to find out more about who was on it, go to ted.npr.org. To see hundreds more TED Talks, check out ted.com or the TED app. And you can listen to this show anytime by subscribing to our podcast. You can do it right now on Apple Podcasts or however you get your podcasts.
Our production staff at NPR includes Jeff Rogers, Sanaz Meshkinpour, Jinae West, Neva Grant, Rund Abdelfatah, Casey Herman and Rachel Faulkner, with help from Daniel Shukin and Benjamin Klempay. Our intern is Diba Mohtasham. Our partners at TED are Chris Anderson, Colin Helms, Anna Phelan and Janet Lee. If you want to let us know what you think about the show, you can write us at firstname.lastname@example.org. You can tweet us. It's @TEDRadioHour. I'm Guy Raz, and you've been listening to ideas worth spreading, right here on the TED Radio Hour from NPR.
(SOUNDBITE OF SONG, "NUMBERS DON'T LIE")
THE MYNABIRDS: (Singing) Oh, no, the numbers don't lie.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.