San Francisco DA Looks To AI To Remove Potential Prosecution Bias

NPR's Michel Martin speaks with San Francisco District Attorney George Gascon about his plan to use artificial intelligence to combat racial bias in charging decisions.

MICHEL MARTIN, HOST:

Racial bias in the criminal justice system has become a national concern. Activists and even some elected officials have been pressing to find ways to eliminate or reduce disparities in everything from the use of force to sentencing decisions. Now the district attorney of San Francisco has announced a new approach to eliminating racial bias in charging decisions. The DA's office will use technology intended to remove the potential for implicit biases when prosecutors first review a case. We're joined now by San Francisco's District Attorney George Gascon. Welcome. Thanks so much for talking to us.

GEORGE GASCON: Thank you, Michel.

MARTIN: So I understand that the idea for this came from a 2017 study, commissioned by your office, which found, quote, "substantial racial and ethnic disparities in criminal justice outcomes." So then, what gave you the idea for this tool? It was developed, as I understand it, with the help of Stanford's Computational Policy Lab.

GASCON: Right. So when I first became DA in 2011, one of the concerns I had was understanding to what level, if any, the work we were doing in the DA's office was being compromised by implicit bias filtering into it. We tried really hard, obviously, not to have explicit bias, but implicit bias is more complicated, as you know, because it's part of the socialization process - whatever your behavior is, you're not really seeing it. So we brought in researchers to help us first look at our work in general terms and see whether there was anything we could do to reduce the disproportionality in our work when it came to dealing with racial minorities and to ensure that we were not treating people disparately.

MARTIN: So let me just be clear about this. As I understand it, if you use this technology on a police report, it will redact information about the race of the people involved. It will get rid of the names of the officers, witnesses and suspects. And it eliminates references to locations and neighborhoods that could suggest somebody's race. So how does this work?

GASCON: Right. So what you have is, basically, artificial intelligence that has been trained to do a natural language search. First, it will go through all the face sheets of the reports and redact anything that has to do with race - hair color, eye color, those things that normally would be identified with a particular race - also names, because sometimes they can be, and locations, where the location is simply the place where the incident occurred. Then it goes into the actual narrative and, again using natural language search, removes all the things that could be either racial identifiers or proxies for racial identifiers.
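
To make that two-pass redaction concrete, here is a minimal Python sketch of the kind of scrub Gascon describes. It is an illustration only: the actual Stanford tool's term lists, models, and field names are not described in this interview, so the vocabularies and patterns below are hypothetical placeholders.

import re

# Hypothetical vocabularies; the real tool's term lists and models are
# not public in this interview, so these are illustrative placeholders.
RACE_TERMS = ["white", "black", "hispanic", "asian", "latino"]
PROXY_TERMS = ["dreadlocks", "tenderloin", "bayview"]  # hair styles, neighborhoods
REDACT_FIELDS = {"race", "hair_color", "eye_color", "name", "location"}

def redact_face_sheet(face_sheet):
    """Pass 1: blank the structured fields that identify or proxy race."""
    return {k: ("[REDACTED]" if k in REDACT_FIELDS else v)
            for k, v in face_sheet.items()}

def redact_narrative(text):
    """Pass 2: scrub racial identifiers and proxies from the free text."""
    for term in RACE_TERMS + PROXY_TERMS:
        text = re.sub(rf"\b{re.escape(term)}\b", "[REDACTED]", text,
                      flags=re.IGNORECASE)
    # Crude name masking: two consecutive capitalized words.
    return re.sub(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b", "[NAME]", text)

report = {
    "face_sheet": {"race": "B", "hair_color": "black", "eye_color": "brown",
                   "name": "John Doe", "location": "500 Market St",
                   "charge": "robbery"},
    "narrative": "Witnesses described a Black male who fled toward the Tenderloin.",
}
blind = {"face_sheet": redact_face_sheet(report["face_sheet"]),
         "narrative": redact_narrative(report["narrative"])}
print(blind["narrative"])
# Witnesses described a [REDACTED] male who fled toward the [REDACTED].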

The prosecutor then makes an assessment, based on the facts being described, of whether we have a prosecutable case. They'll say yes or no. And then they go into the second level of review, and that's when you start looking at descriptors. So you may see video. You may see a bunch of other things now that clearly are going to tell you what the race of the person is. But by now you've made your first decision. Now you look at it, and then you confirm whether you're still going to go through with your original decision.
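
The two-step workflow he outlines also suggests a natural audit point: record both decisions and measure how often the unredacted review reverses the blind one. Again a minimal sketch, with assumed names and fields rather than the office's actual system:

from dataclasses import dataclass
from typing import Optional

@dataclass
class ChargingReview:
    """Two-stage review as described in the interview: a blind decision on
    the redacted facts, then a second pass with the full, unredacted report.
    Field names are assumptions for illustration, not the office's schema."""
    case_id: str
    blind_decision: Optional[bool] = None   # stage 1: prosecutable? yes/no
    final_decision: Optional[bool] = None   # stage 2: confirmed or reversed

    def record_blind(self, charge: bool) -> None:
        self.blind_decision = charge

    def record_final(self, charge: bool) -> None:
        if self.blind_decision is None:
            raise ValueError("the blind decision must come first")
        self.final_decision = charge

    @property
    def was_reversed(self) -> bool:
        # A reversal after unblinding is the kind of signal the office
        # could audit when checking for unintended consequences.
        return (self.final_decision is not None
                and self.final_decision != self.blind_decision)

review = ChargingReview(case_id="2019-001")
review.record_blind(charge=True)   # decision on redacted facts only
review.record_final(charge=True)   # confirmed after seeing the full report
print(review.was_reversed)         # False: the blind decision stood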

MARTIN: So how are your prosecutors responding to this? And I'm also curious because, as many people may know, you have a unique profile. I mean, you are believed to be the first former police chief to become a district attorney. So I'm curious about how your prosecutors are reacting to this and how the police department is reacting.

GASCON: Right now, with the police - I can't tell you for sure. I can tell you, as to my prosecutors, that we took the supervisor and the people involved in our charging decisions, and they actually helped us work out the system. So the process that we have created was actually a collaboration with people in my office. So they are supportive, and they're intrigued by what it's going to look like.

Again, we want to make sure that there are no negative unintended consequences to this, so we're going to be looking at it very closely. We are working with the Stanford people to make sure that we create the right metrics, you know? The goal here is to do something that does the right things. And part of that will be learning and seeing if there are any negative unintended consequences so that we can go back and fix them very quickly.

MARTIN: That is George Gascon. He is the district attorney for the city and county of San Francisco. George Gascon, thank you so much for talking with us.

GASCON: My pleasure, Michel.

Copyright © 2019 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.