Understanding Unconscious Bias
MADDIE SOFIA, BYLINE: You're listening to SHORT WAVE from NPR.
EMILY KWONG, HOST:
The human brain is a marvelous sponge that can process 11 million bits of information every second. But like a sponge, it's leaky. Our conscious minds, the thoughts we are aware of, can only handle 40 to 50 bits of information a second, which means that way more is entering our heads than we realize.
PRAGYA AGARWAL: There's so much information coming at us. We can't really process all that information in a very rational, logical manner. Otherwise, we would be agonizing over every decision we make.
(SOUNDBITE OF MUSIC)
KWONG: Pragya Agarwal is a behavioral and data scientist in the U.K. who explores this in her new book.
(SOUNDBITE OF MUSIC)
KWONG: So what's the human brain to do? Well, Pragya says we sometimes take cognitive shortcuts to help make those decisions easier, shortcuts that can lead to implicit bias or, as it's sometimes called, unconscious bias, which is what her book "Sway" is all about.
AGARWAL: These are some of the biases or prejudices that we carry within us. And we might think that we are really fair-minded and egalitarian, but they often spring up on us when we least expect it, often when we are tired or distracted or in a hurry.
KWONG: Including, of course, racial bias. Pragya gave me a short example from her own life. Raised in India, she came to the U.K. over 20 years ago and now lives with her husband and three kids in a beautiful seaside town.
AGARWAL: The sandy beach is, like, 10 minutes away, which is great for the dog. But it is not a very multicultural place.
KWONG: At home, running into others' bias is kind of an everyday experience.
AGARWAL: Especially since Brexit happened. And I'm in the supermarket, and a white woman recently told me, oh, this is not how we do things here.
KWONG: Remember; Pragya has been living in the U.K. for over 20 years. Bias really is all around.
AGARWAL: Yes, exactly.
KWONG: I think I just inadvertently quoted "Love Actually," which is an export of the U.K. Isn't there that line?
AGARWAL: Yes. How many times have you watched "Love Actually" (laughter)?
KWONG: Anyway, today on the show, Pragya Agarwal on what science has to say about unconscious bias, where it comes from and how we can check our unconscious biases in the moment. I'm Emily Kwong, and this is SHORT WAVE, the daily science podcast from NPR.
(SOUNDBITE OF MUSIC)
KWONG: OK. So today, we're speaking with behavioral scientist Pragya Agarwal about unconscious bias, and racial bias in particular. And she explains that what's tricky about bias is that our brains are kind of built for it. You have your amygdala.
AGARWAL: Our amygdala responds to fear and threat.
KWONG: It's the center of emotional processing associated with these quick, strong reactions, like fear and anxiety. Now, our brains have this mental process called System 1 that is in charge of making any unconscious, almost spontaneous decision. You know it as your gut instinct, but really, it's your amygdala talking.
AGARWAL: We know that in System 1 processing, when we are taking these decisions in a very rushed manner, it's only our kind of amygdalas responding to it because we're matching the information to preexisting stereotypes and templates.
KWONG: Templates and stereotypes, some of them having to do with race, which we pick up over time from the environment around us - our family, our school, our community, the media. System 1 will associate what we're seeing with these stereotypes and make quick decisions based on those preexisting ideas.
AGARWAL: We can be prey to our biases without realizing them.
KWONG: Now, some biases are more complex and tap into other parts of the brain, but the basic point is that the process our brains use to create bias is innate, even if the stereotypes and assumptions we create are not. So I wondered, why did our brains evolve this way? And Pragya said there are some evolutionary theories.
AGARWAL: There's this notion that our ancestors of the distant past had to make very quick decisions about who was part of the group or not because they were competing for very limited resources. And, of course, there was more threat of diseases because they hadn't built immunity to certain diseases and illnesses. They were less sure of who would bring these diseases or viruses. So these kind of primal instincts of forming in-groups and out-groups, a sense of comfort or a sense of safety, has been there. But we cannot use them to excuse our behaviors now, I believe, because we're not in the same kind of situations. We are not competing for the same kind of resources. We don't have the same kind of threat or fear.
KWONG: So while the learning process is hard-wired, what we teach our brains is up to us. And as Pragya sees it, for society to address structural racism - the policies and practices that uphold racial inequity - individuals must address their racial bias, too. In fact, these two forces kind of feed off of one another.
AGARWAL: Unconscious bias cannot be used to excuse systemic inequalities. It's kind of a cycle where systemic and structural biases lead to some of this individual and interpersonal racism and these racist beliefs and prejudices, but those in turn reinforce the systemic and structural. So it's kind of a cycle that we have to talk about.
KWONG: But you can't break a cycle you don't understand to begin with. The idea of unconscious bias was first defined by psychologists in 1995, and since then there's been headway in measuring bias and showing how it works in the world. Pragya wrote about many of these experiments in her book.
One experiment that was really striking to me was the 2007 study from the University of Colorado Boulder and the University of Chicago. So these are research teams that looked at the role of stereotypes in officer-involved shootings.
AGARWAL: Yes. So I think these are really powerful examples because even before we were talking about the recent atrocities of police brutality, these studies were showing that in a lot of decisions made in an instinctive way, or in situations where quick decisions have to be made, police officers can fall back on these generalized assumptions or stereotypes that they carry.
So in this experiment, for instance, there was a virtual simulation platform where a number of people, men, were walking around with something hidden in their hands, and the participant had to make a decision about whether it was just an innocuous or harmless tool or a gun. And then they had to, within a limited time, press a button labeled Shoot or Don't Shoot. And they had to make this decision based on their assessment of what the men were carrying and whether they were a threat.
KWONG: And the racial bias of the test-takers, which included police officers, showed. They were faster to shoot a Black male target with a gun than a white male target with a gun.
AGARWAL: And even when they were not carrying a gun, the police officer was likely to assume that they were carrying a gun.
KWONG: Mind you, this was a video simulation, but the study was successful in measuring racial bias in real time.
AGARWAL: And I think these results are sometimes terrifying, actually.
KWONG: Yeah. And these biases - you write about them existing in other parts of society - racial biases beyond policing. You write about the impacts in medicine and in law and in law enforcement. It just seems like racial bias is everywhere.
AGARWAL: Yes, I think so. I think that we need to have a really serious conversation around this because we see that it has a huge impact in life-and-death situations. In medical and health care, we know that Black women are more likely to die during pregnancy - they have a higher maternal mortality rate. And the research has shown that there's a huge mental, as well as physical, impact. It causes stress and anxiety. And carrying this trauma affects the body in a number of ways.
We know about, obviously, housing discrimination. We have so many case studies about name discrimination - foreign-sounding names. We have so many studies now of this discrimination also seeping into technology - facial recognition, the cameras that we use, self-driving cars. All these things have racial biases inbuilt in them which affect people's lives.
KWONG: Yeah. I wanted to ask you about that, too - it affects people's lives, and these very same people, you and me included, have bias. People are both impacted by bias and acting upon their bias, which leads me to wonder: Why is it so hard for us to recognize our own biases in the moment?
AGARWAL: Because I think we all want to believe that we are fair-minded and that we are not contributing to these societal inequities in any way. We don't want to acknowledge the privileges that we have. And I think writing this book was quite triggering in a number of ways because I had to reflect on my own biases as I was writing that.
I talk about biases, about anti-Blackness within the South Asian community, because I think recently, when we're talking about Black Lives Matter and racial prejudice against Black people, I, as part of a South Asian community, have a responsibility to talk about colorism and anti-Blackness in the South Asian community and about how the whole model minority thing can make people perpetuate and enable some of these racial prejudices in society as well, both in the U.S. and the U.K.
So I think it's the responsibility of each of us to acknowledge these generalized assumptions, these stereotypes that we carry. And often, what happens is that we might carry these stereotypes and not act on any of them. So we also have to understand the triggers that bridge the gap between holding these stereotypes and activating them. Yeah, I think it's really important for all of us to acknowledge that.
KWONG: OK. You end Chapter 8 of your book by saying, quote, "as the nature of racism and its form of articulation changes, being quiet and nonracist isn't enough anymore. Silence breeds prejudice. The only way to combat racism is to be actively anti-racist." So how do we check our biases in the moment?
AGARWAL: I don't think any amount of training or testing can cure us of unconscious biases - these shortcuts that we employ - but I think it has to be an ongoing process. We are constantly educating ourselves and constantly being aware of that. We are exposing ourselves to diverse views and diverse perspectives. We're actively stepping out of our echo chambers and listening to dissenting voices and dissenting views. We are checking the validity of any information that we get and not just acting on impulses. And when decisions do matter a lot, we need to take time with those decisions.
We can also check our assumptions and, rather than employing generalized assumptions like all men act like that or all women act like that, in that moment we can say, this man is acting like this, or, this person is acting like this, rather than using the word all. And I think as we start doing that, we can start breaking down or dismantling some of these assumptions that we carry about groups of people.
(SOUNDBITE OF MUSIC)
KWONG: This episode was produced by Rebecca Ramirez, edited by Viet Le and fact-checked by Berly McCoy. I'm Emily Kwong with a reminder to wash your hands and check your bias. You've been listening to SHORT WAVE from NPR.
(SOUNDBITE OF MUSIC)
NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.