Can Computers Be Racist? The Human-Like Bias Of Algorithms

As algorithms play a growing role in determining content, critics say the results are often filled with biases. Women see ads for lower paying jobs and African-Americans for cheaper neighborhoods.

ARI SHAPIRO, HOST:

Can computers be racist and sexist? Well, yes, they can, and that's the topic of this week's All Tech Considered.

(SOUNDBITE OF MUSIC)

SHAPIRO: It's a case of garbage in, garbage out. People have biases, so when they write computer programs, those programs can have biases, too. NPR's Laura Sydell is at the South by Southwest Interactive conference where tech experts are talking about that problem.

LAURA SYDELL, BYLINE: Jackie Alsina (ph) was at a concert with friends, and he took a bunch of pictures. Later, he loaded them into Google Photos, which stores and automatically organizes images. Its software can group together pictures of a particular friend, or pictures of dogs, cats, et cetera. But when it labeled a picture of one of Alsina's friends, it left him speechless. Both he and his friend are African-American.

JACKIE ALSINA: It labeled them as something else. It labeled her as a different species, a creature.

SYDELL: You can't even say it.

ALSINA: Yeah, and I kind of refuse to (laughter) more or less 'cause I don't want to accept - by saying it, I kind of, like, reinforce the idea of it, so I kind of try and pull away from that.

SYDELL: I'm not going to reveal which animal it labeled his friend. It also happened to others with dark skin, and Google has apologized and fixed the problem. But Alsina isn't buying that it's just some weird technical glitch.

ALSINA: And, well, they say, oh, it's a computer. I'm like, OK, yeah, a computer built by whom, a computer designed by whom, a computer trained by whom?

SYDELL: Alsina's conclusion is that there probably weren't any black people on the team that designed Google Photos. Google says it did test the product on employees of different races. Alsina's experience is one of many strange biases that turn up in computer algorithms, which sift through data for patterns. Most of us are familiar with the recommendation algorithms used by Amazon and Netflix. If you like this movie, you'll probably like that one.

For example, the computer may learn over time that viewers who like the film "Terminator" also enjoy "Ex Machina." But in another context, user feedback can harden societal biases. A couple of years ago, a study at Harvard found that when someone searched Google for a name typically associated with African-Americans, an ad for a company that finds criminal records was more likely to turn up. The algorithm may have initially done this for both black and white people, but over time, the biases of the people who did the search probably got factored in, says Christian Sandvig, a professor at the University of Michigan's School of Information.

CHRISTIAN SANDVIG: Because people tended to click on the ad topic that suggested that that person had been arrested, when the name was African-American, the algorithm learned the racism of the search users and then reinforced it by showing that more often.
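The feedback loop Sandvig describes can be sketched in a few lines of code. What follows is a minimal, hypothetical simulation, not any company's actual ad system: a greedy ad server always shows whichever ad has the higher observed click-through rate for a group, so a small difference in how searchers click gets amplified into a large difference in which ads a group sees. The group names and click rates are invented for illustration.

```python
import random

random.seed(0)

ADS = ["arrest_records", "neutral"]

# Hypothetical true click probabilities: searchers click the arrest-records
# ad slightly more often for one group. The gap reflects searcher bias, not
# any fact about the people being searched for.
TRUE_CTR = {
    ("group_a", "arrest_records"): 0.06,
    ("group_a", "neutral"): 0.05,
    ("group_b", "arrest_records"): 0.04,
    ("group_b", "neutral"): 0.05,
}

# Observed impressions and clicks, starting from a tiny uniform prior.
shows = {(g, a): 1 for g in ("group_a", "group_b") for a in ADS}
clicks = {(g, a): 1 for g in ("group_a", "group_b") for a in ADS}

def pick_ad(group):
    """Greedy choice: serve the ad with the highest observed CTR so far."""
    return max(ADS, key=lambda a: clicks[(group, a)] / shows[(group, a)])

for _ in range(100_000):
    group = random.choice(["group_a", "group_b"])
    ad = pick_ad(group)
    shows[(group, ad)] += 1
    if random.random() < TRUE_CTR[(group, ad)]:
        clicks[(group, ad)] += 1

for group in ("group_a", "group_b"):
    total = sum(shows[(group, a)] for a in ADS)
    share = shows[(group, "arrest_records")] / total
    print(f"{group}: arrest-records ad shown {share:.0%} of the time")
```

In a run like this, the greedy rule ends up showing the arrest-records ad to one group the vast majority of the time even though the underlying difference in click behavior is small. That amplification of user behavior into skewed output is the learning-and-reinforcing dynamic Sandvig is pointing to.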

SYDELL: Sandvig says there are also studies that show women are more likely to be shown lower-paying jobs than men in online ads. Sorelle Friedler, a professor of computer science at Haverford College, thinks women may reinforce this bias without realizing it.

SORELLE FRIEDLER: It might be that women are truly less likely to click on those ads, and probably, that's because of the long history of women making less than men and so perhaps thinking, oh, that ad isn't really for me; I'm not as likely to get that job.

SYDELL: And so the algorithm determines it should no longer show those ads to women. What worries Friedler and other social scientists is that computer algorithms are being used in a growing number of ways. Pennsylvania is testing the idea of using algorithms to suggest how to sentence a particular criminal. Some companies are using them to narrow the pool of job applicants. And it can be hard to guard against bias, says Professor Sandvig.

SANDVIG: The systems are of a sufficient complexity that it is possible to say the algorithm did it. And it's actually true. The algorithm is sufficiently complicated, and it's changing in real time. It's writing its own rules on the basis of data and input that it does do things, and we're often surprised by them.

SYDELL: Yet, Sandvig remains optimistic about fixing biased algorithms and actually using computers to guard against prejudice. There are already apps to keep discrimination out of the hiring process. Sandvig and others say it's important to talk about the problem because if humans don't fix it, computers won't do it themselves. Laura Sydell, NPR News, Austin, Texas.

Copyright © 2016 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.