GUY RAZ, HOST:
So many people - most people of goodwill - will try not to be biased, or they think that they're not biased, right? But, you know, it's almost like there's something in our wiring - right? - as human beings - like we just heard Yassmin say - that makes us biased.
MARSHALL SHEPHERD: Oh, absolutely. I often think about marinades. We marinate our food to give it flavoring. I think that we are products of our own personal marinades in terms of our biases and how they evolve.
RAZ: This is Marshall Shepherd.
SHEPHERD: If you think about your personal marinade - your parents, your faith-based upbringing, your cultural or geographic marinades - all of those flavor who we become and what biases we have.
RAZ: And Marshall says our beliefs and assumptions really do shape the information we seek out and share, especially scientific information. And he sees this all the time because Marshall is a climatologist, which also explains why he gets these two questions a lot.
(SOUNDBITE OF TED TALK)
SHEPHERD: Do you believe in climate change? Do you believe in global warming?
RAZ: Here's more from Marshall Shepherd on the TED stage.
(SOUNDBITE OF TED TALK)
SHEPHERD: I have to gather myself every time I get that question because it's an ill-posed question. Science isn't a belief system. My son - he's 10. He believes in the tooth fairy.
SHEPHERD: Consider this. You never hear anyone say, do you believe if you go to the top of that building and throw a ball off, it's going to fall? You never hear that because gravity is a thing.
SHEPHERD: So why don't we hear the question, do you believe in gravity? But, of course, we hear the question, do you believe in global warming? Eighty-seven percent of scientists believe that humans are contributing to climate change, but only 50 percent of the public does. How do we get there? So it begs the question, what shapes perceptions about science? I think that one thing that shapes perceptions in the public about science is belief systems and biases. Belief systems and biases - go with me for a moment because I want to talk about three elements of that - confirmation bias, the Dunning-Kruger effect and cognitive dissonance.
(SOUNDBITE OF MUSIC)
RAZ: So let's talk about confirmation bias. What - like, how does confirmation bias work?
SHEPHERD: It's simply this notion that we seek out information that confirms what we already think or believe. And you see that in the choices that people make about what radio or television station or news network they decide to listen to or what books they read or what websites they choose to visit.
(SOUNDBITE OF TED TALK)
SHEPHERD: I'm on Twitter. And often, when it snows, I'll get this tweet back to me.
SHEPHERD: Hey, Dr. Shepherd. I have 20 inches of global warming in my yard. What are you guys talking about, climate change? It's a cute tweet. It makes me chuckle as well. But it's oh so fundamentally scientifically flawed because it illustrates that the person tweeting doesn't understand the difference between weather and climate. I often say weather is your mood. Climate is your personality. Your mood today doesn't necessarily tell me anything about your personality, nor does a cold day tell me anything about climate change - or a hot day, for that matter.
(SOUNDBITE OF MUSIC)
SHEPHERD: You know, those are opportunities to increase science or climate literacy in those regards. They're cute, and they're funny. But it really is steeped in, you know, what we're talking about today. Someone that typically says that, one, doesn't understand the difference between weather and climate. But it also is an inherent bias that they probably have because they don't believe in climate change anyhow.
RAZ: So in your talk, you also mentioned the Dunning-Kruger effect. What is that? How does that work?
SHEPHERD: Dunning and Kruger were two professors at Cornell University - psychology professors - who published a paper several years ago. And they used all these fancy terms, saying the Dunning-Kruger effect is this notion of illusory superiority, blah, blah, blah, blah. In other words, the Dunning-Kruger effect is people think they know more than they do, or they underestimate what they don't know about a topic. And we see this all of the time at Thanksgiving dinner or on Twitter. Twitter is a bastion of Dunning-Kruger (laughter)...
SHEPHERD: ...Because you have people giving information, establishing themselves as experts with no basis for that expertise at all.
RAZ: Yeah. And I probably do this, too. I think a lot of...
SHEPHERD: We all do it.
RAZ: Yeah. Right?
SHEPHERD: We all do it, yeah.
RAZ: There are plenty of things that I encounter every day that I sort of weigh in on when, if I really think about it, I don't know as much about it as I think I do.
SHEPHERD: Yeah. No, we all do this. I mean, I experience it quite a bit when it comes to weather forecasting and climate change. It's very pervasive. But, you know, as I step back, I say, look. You know, a couple months ago, I had a leak in my yard. And it turned out that the water line was leaking from the water box to the house. You know, I quickly surveyed the landscape and said, you know what? This is above my pay grade. I'm going to call a plumber (laughter). And so the point in that is, yeah, we all exhibit Dunning-Kruger. But let's kind of get back to this experts being experts on a topic. And that's an interesting conversation in itself because sometimes, I feel like we are in an era where people, actually, are hostile towards experts. They actually say that, oh, because you're an expert, you think you're better than me. No, that's not the case at all. I think we're equal human beings. But I think I know a little bit more about meteorology than you do.
(SOUNDBITE OF MUSIC)
RAZ: So, Marshall, finally, you talk about cognitive dissonance. And cognitive dissonance is like, you know, like, sort of having faith in something that is not necessarily rooted in reality, right?
SHEPHERD: Oh, yeah. To me, one of the most sort of ultimate examples of cognitive dissonance is a fairly educated person that will walk up to me and say, yeah, you know, I think this climate change stuff is a hoax. I don't believe it at all. But oh, by the way, did you see the groundhog forecast today? What do you think about that? So I'd say, well, I think it's a rodent, first of all. You know, the ultimate cognitive dissonance is that you're dismissing science evidence about climate change, but then you're asking me about a rodent's forecast for the spring. I just - that - those are the types of things that are amusing, but I think that's really what we're dealing with.
RAZ: You know, I wonder whether even the context of our conversation could be perceived - unfairly - but could be perceived as a liberal conversation because more liberals believe the science about climate change.
SHEPHERD: Yeah. I just don't see the concept of being objective as a liberal or conservative paradigm. I wrote an article, for example, that showed that on some issues, there are those that see themselves as conservatives that may have certain views about, say, climate change. But if you look at vaccination issues and this notion that we shouldn't vaccinate our kids, you will find that some people that fall in those categories aren't conservative in other aspects of their lives at all. So I talk to conservative groups. I talk to liberal groups, policymakers of all ilk. And when you're talking to people about topics where you, from your perspective, aren't coming at it from a political angle, but maybe they're hearing it from a political angle, you have to just try to connect with the value system that resonates with them. So when I talk about climate change to a group of Republicans or conservatives, you know, I'm talking about economic issues, national security - whereas if I'm talking to a different group, I may talk more about the threat to polar bears or disease. So finding the value system at least tries to disarm those inherent biases or that inherent confirmation bias that your audience or listener may be coming from.
RAZ: What happens - what's at stake if we don't work to at least recognize our biases?
SHEPHERD: One of the things that I'm really concerned about, just as a society right now, is the tribalism that I see across the board. If we are not fighting off our biases, we're going to continue to divide. We're going to continue to make good science, information and knowledge the enemy. And that's scary because science, technology and engineering are not the enemy of the people. Virtually everyone listening to me right now is holding a smartphone. Virtually everyone that's listening to me consumed information from a weather forecast today or had some type of medical procedure that benefited from research and development or has a GPS system in their car. That's science. Those are things that make our lives better. And yet, this sort of bias that permeates the discussion about information and science is very dangerous because it is jeopardizing our ability to move society forward.
RAZ: That's Marshall Shepherd. He's director of the atmospheric sciences program at the University of Georgia. You can see his full talk at ted.com.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.