It Ain't Me, Babe: Researchers Find Flaws In Police Facial Recognition Technology


Stephen Lamm, a supervisor with the ID fraud unit of the North Carolina Department of Motor Vehicles, looks through photos in a facial recognition system in 2009 in Raleigh, N.C. Gerry Broome/AP

Nearly half of all American adults have been entered into law enforcement facial recognition databases, according to a recent report from Georgetown University's law school. But the technology has accuracy problems that could affect many innocent people.

There's a good chance your driver's license photo is in one of these databases. The report from the school's Center on Privacy & Technology says more than 117 million adults' photos are stored in them. Facial recognition can be used, for instance, when investigators have a picture of a suspect and they don't have a name.

They can run the photo through a facial recognition program to see if it matches any of the license photos. It's kind of like a very large digital version of a lineup, says Jonathan Frankle, a computer scientist and one of the authors of the report, titled "The Perpetual Line-Up."

"Instead of having a lineup of five people who've been brought in off the street to do this, the lineup is you. You're in that lineup all the time," he says. Frankle says the photos that police may have of a suspect aren't always that good — they're often from a security camera.

"Security cameras tend to be mounted on the ceiling," he says. "They get great views of the top of your head, not very great views of your face. And you can now imagine why this would be a very difficult task, why it's hard to get an accurate read on anybody's face and match them with their driver's license photo."

Frankle says the study also found evidence that facial recognition software didn't work as well with people who have dark skin. There's still limited research on why this is. Some critics say the developers aren't testing the software against a diverse enough group of faces. Or it could be lighting.

"Darker skin has less color contrast. And these algorithms rely on being able to pick out little patterns and color to be able to tell people apart," Frankle says.

Because of its flaws, facial recognition technology does bring a lot of innocent people to the attention of law enforcement.

Patrick Grother says most people have a few doppelgangers out there. He's a computer scientist with the National Institute of Standards and Technology — part of the Commerce Department. "The larger you go, the greater the chance of a false positive," he says. "Inevitably if you look at a billion people you will find somebody that looks quite similar."
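Grother's point about scale can be illustrated with a toy model: if each one-to-one comparison independently produces a false match with some tiny probability p (an assumed, illustrative figure, not a measured error rate for any real system), the chance of at least one false match grows quickly with the size of the gallery being searched.

```python
# Toy model of false-positive growth with gallery size, assuming each
# comparison independently yields a false match with probability p.
# (p is illustrative; real matchers' error rates vary with image quality.)
def false_positive_chance(n, p=1e-6):
    """Probability of at least one false match in a gallery of n faces."""
    return 1 - (1 - p) ** n

for n in (1_000, 1_000_000, 117_000_000):
    print(f"gallery of {n:>11,d} faces: {false_positive_chance(n):.1%}")
```

Even with a one-in-a-million per-comparison error rate, a search against a gallery the size of the Georgetown report's 117 million is all but guaranteed to surface at least one look-alike.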

And even with the photos taken at the Department of Motor Vehicles, there can be differences in how they are shot. Grother thinks if those photos are going to be used for facial recognition, more uniform standards in lighting, height and focus are needed. "Without those things, without those technical specifications, then face recognition can be undermined," he says.

A New York Police Department security camera set up along a street in New York City on Aug. 26. Robert Alexander/Getty Images

And yet facial recognition software is sophisticated enough to be useful in critical situations. Anil Jain, a computer science professor at Michigan State University, did an experiment after the Tsarnaev brothers, who committed the Boston Marathon bombings, were caught. He wanted to see if facial recognition technology could have helped police name them sooner. Police had photos of them from a security camera. Jain ran those photos against a database of a million driver's licenses. The software found 10 matches for the younger brother.

"We were able to locate him in the top 10 candidates," Jain says. "But the older brother we couldn't locate, and the reason was he was wearing the dark glasses."

Of course, it also identified nine people who were not guilty.
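The kind of search Jain describes can be sketched as nearest-neighbor retrieval over face embeddings. This is a hypothetical illustration, not his actual pipeline: random unit vectors stand in for the embeddings a trained face model would extract, and the ten highest-scoring gallery entries play the role of his top-10 candidate list.

```python
import numpy as np

# Hypothetical sketch of gallery search: rank face embeddings by cosine
# similarity to a probe image and keep the top 10 candidates.
# Random vectors stand in for embeddings from a real face model.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(100_000, 128))      # one embedding per license photo
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

# Simulate a noisy security-camera shot of the person at index 42.
probe = gallery[42] + 0.1 * rng.normal(size=128)
probe /= np.linalg.norm(probe)

scores = gallery @ probe                        # cosine similarity to every photo
top10 = np.argsort(scores)[::-1][:10]           # ten best candidates
print(42 in top10)
```

The search returns a ranked shortlist, not a positive identification, which is why the nine other candidates on such a list are innocent people swept in by resemblance alone.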

In a statement responding to the Georgetown study, the FBI says it uses facial recognition only as an investigative lead, not for positive identification.

The Georgetown authors aren't saying that this technology should never be used — only that lawmakers need to create standards; otherwise, it can be misused and harm innocent people.

'The Perpetual Line-Up'

A study by the Center on Privacy & Technology at Georgetown Law examined police facial recognition technology. Here are some of the findings and recommendations.

Findings

  • Law enforcement face recognition networks include over 117 million American adults — and may soon include many more.
  • By running face recognition searches against 16 states' driver's license photo databases, the FBI has built a biometric network that primarily includes law-abiding Americans.
  • Major police departments are exploring real-time face recognition on live surveillance camera video.
  • Law enforcement face recognition is unregulated.
  • Police face recognition could be used to stifle free speech.
  • Most law enforcement agencies do little to ensure that their systems are accurate.
  • Without specialized training, human users make the wrong decision about a match half the time.
  • Police face recognition will disproportionately affect African-Americans.

Recommendations

  • Law enforcement face recognition searches should be conditioned on an individualized suspicion of criminal conduct.
  • Mug shot databases used for face recognition should exclude people who were found innocent or who had charges against them dropped or dismissed.
  • Searches of driver's license and ID photos should occur only under a court order issued upon a showing of probable cause.
  • Limit searches of license photos — and after-the-fact investigative searches — to investigations of serious offenses.
  • Real-time video surveillance should only occur in life-threatening public emergencies under a court order backed by probable cause.
  • Use of face recognition to track people on the basis of their race, ethnicity, religious or political views should be prohibited.
  • The FBI should test its face recognition system for accuracy and racially biased error rates, and make the results public.

Source: The Perpetual Line-Up