Axon Considers Combining Body Cameras With Facial Recognition Axon, formerly Taser, has created a new ethics board to consider using artificial intelligence and facial recognition in local policing.

Body Camera Maker Weighs Adding Facial Recognition Technology

SCOTT SIMON, HOST:

Axon, formerly known as TASER International, makes tasers and body cameras for police departments. And in the near future, body cams may be equipped with facial recognition software. The company has created a new ethics board to consider some of the implications of this software and the emerging use of artificial intelligence in local policing. Axon CEO Rick Smith joins us now from their offices in Scottsdale, Ariz. Thanks so much for being with us.

RICK SMITH: Thanks for having me on today.

SIMON: As I don't have to tell you, Mr. Smith, you announced your ethics board, and a group of more than 40 civil rights and tech groups wrote you a letter in which they said you should never develop real-time facial recognition through police body cameras because they say the risks of misidentification are too high. Innocent people could be pursued by the police - sometimes suffer fatal consequences. How do you respond to that?

SMITH: Well, we agree philosophically with the issues that were raised. But I think it's counterproductive to say that a technology is unethical and should never be developed. I think what we need to do is take a look at how this technology could evolve. What are the risks? But basically today, an individual officer might have to make life-or-death decisions based only on their own perceptions and prejudices. Do we think that computers getting information to those officers that could help them make better decisions would move the world in the right direction? And I think the answer is unequivocally, yes, that could happen.

SIMON: As the technology stands now - as we've heard reported any number of times, the technology's especially faulty when it comes to seeing the differences in darker faces.

SMITH: I think that has to do with the types of training data sets that have been used historically. Certainly that is one of the issues we would take a very hard look at before we developed anything to be deployed in the field.

SIMON: I gather Chinese police, for example, routinely use facial recognition technology. Some of them even have sunglasses that come equipped with cameras that can identify faces in real time. They say it's allowed them to arrest suspected criminals. And in China, you know, criminals can include people who just believe in free speech. Do you have reservations about using that technology here?

SMITH: Well, you know, for example, there are police forces around the world that use batons and guns in very abusive ways. And yet ultimately, we know that our police, in order to do their job, need to have those same types of tools. We understand that these technologies could be used in ways that we don't want to see happening in our society. However, I think it's too blunt an instrument to say that because there is a risk of misuse, we should just write them off. I think we need to dig a layer deeper and understand what are the benefits and what are the risks.

SIMON: What are the benefits in your mind?

SMITH: Well, I mean, you could imagine many benefits. I think one example you can look at is DNA technology. You know, when DNA was first being introduced, there was much concern about false positives and false matches. And yet ultimately, I think DNA technology has done more than any other key technology in exonerating people that were wrongfully convicted. I think we'll see other biometrics, including facial recognition technology, that properly deployed with the right oversight over the coming decades could ultimately reduce prejudice in policing and help catch dangerous people that we all agree we don't want out in our communities and do it in a way that, at the same time, respects police transparency and rights of privacy of the average citizen.

SIMON: Maybe this is generational, but, Mr. Smith, how do you feel about the fact that we might soon have a technology that - well, when you leave the office today, it'll recognize you and know when you get into the elevator. It will recognize you when you're in the parking lot. It will recognize you when you stop at a stoplight on your way home and know where you are all the time.

SMITH: Well, it's certainly an interesting world that we're moving into where notions of privacy are changing pretty dramatically. And what's most interesting is I think people are actually opting into these systems. Knowingly and willingly, they're deploying these types of technologies for the convenience that they offer to themselves. And then that opens questions of what does privacy mean in the world we live in today. And frankly, what's it going to mean in another 10 or 20 years?

SIMON: Rick Smith, CEO of Axon, formerly known as TASER International. Thanks so much for being with us, sir.

SMITH: Great. And thank you for the thoughtful questions.

Copyright © 2018 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.