STEVE INSKEEP, HOST:
Many of us would like to think of ourselves as good people, but where do feelings of right and wrong come from? Harvard psychologist Joshua Greene says we actually have two competing areas of the brain that push us to reach very different moral conclusions.
JOSHUA GREENE: These different regions produce these different responses; and then they converge in a kind of battle zone in the front of the brain, right behind the forehead.
INSKEEP: So let's get a report from the battle zone. NPR science correspondent Shankar Vedantam brings us new research from this moral battleground.
SHANKAR VEDANTAM, BYLINE: When we think about morality, many of us think about religion, or what our parents taught us when we were young. Those influences are powerful, but many scientists now think of the brain as a more basic source for our moral instincts. One of the tools scientists use to study how the brain makes moral decisions is stories - like this one, read by Joshua Greene. That's the Harvard psychologist you just heard from a second ago.
GREENE: A trolley is headed towards five people, and the only way you can save them is to hit a switch that will turn the trolley away from the five and onto a side track. But if you turn it onto the side track, it will run over one person.
VEDANTAM: It's a moral dilemma. Greene and other researchers have presented this dilemma to research volunteers. Most people say they'd flip the switch and divert the trolley. They say they don't want to kill someone, but one innocent person dead is better than five innocent people dead. What this shows is that people resolve the moral dilemma by doing a sort of cost-benefit analysis. Greene says they look at the consequences of each choice, and pick the choice that does the least harm. In other words, people are what philosophers would call utilitarians - except, Greene tells me, sometimes they aren't.
GREENE: This time, you're on a footbridge, in between the oncoming trolley and the five people. And next to you is a big person wearing a big backpack. And the only way you can save those five people is to push this big guy off of the footbridge so that he lands on the tracks. And he'll get squashed by the train. You sort of use him as a trolley stopper. But you can save the five people.
VEDANTAM: Would you push the big guy to his death? More important, do you feel this moral dilemma is identical to the earlier one?
GREENE: In a certain sense, they're identical - trade one life to save five - but psychologically, they're very different.
VEDANTAM: Pushing someone to their death feels very different from pushing a switch. When Greene gives people this dilemma, most people don't choose to push the big guy to his death. In other words, people use utilitarian, cost-benefit calculations sometimes. But other times, they make an emotional decision.
GREENE: There are certain lines that are drawn in the moral sand. Some things are inherently wrong, or some things inherently must be done.
VEDANTAM: There's another dimension here that's interesting. If you watched yourself during the first dilemma, you may have noticed you had to think about whether you'd push that switch. In the footbridge dilemma, you probably didn't have to think. You just knew that pushing someone to their death is wrong.
Greene says we really have two completely different moral circuits in our brain. When you listen to a dilemma, the two circuits literally have a fight inside your brain. Part of your brain says slow down, think rationally, make a cost-benefit analysis. Another says no. Don't think about it. This is just wrong.
GREENE: These responses compete in a part of the brain called the ventromedial prefrontal cortex, which is a kind of place where different types of values can be weighed against each other, to produce an all-things-considered decision.
VEDANTAM: So what makes the ventromedial prefrontal cortex go with the rational mode sometimes, and the emotional mode other times? Greene and a colleague, Elinor Amit, looked closely at what was happening to people as they tipped from rational mode to emotional mode. In new research they've just published in the journal "Psychological Science," these psychologists say they have the answer.
GREENE: Emotional responses don't just pop out of nowhere. They have to be triggered by something. And one possibility is that you hear the words describing some event; you picture that event in your mind; and then you respond emotionally to that picture.
VEDANTAM: That's the key. Some dilemmas produce vivid images in our head, and we're wired to respond emotionally to pictures. Take away the pictures; the brain goes into rational calculation mode.
Here's how they found that out: Greene and Amit set up an experiment. They presented people with moral dilemmas that evoked strong visual images. As expected, the volunteers made emotional moral judgments.
Then the psychologists made it difficult for the volunteers to visualize the dilemma. They distracted them by making them visualize something else instead. When that happened, the volunteers stopped making emotional decisions. Not having pictures of the moral dilemma in their head prompted them into rational, cost-benefit mode.
In another experiment, Greene and Amit also found that people who think visually make more emotional moral judgments. Verbal people make more rational calculations. Amit says people don't realize how images tip the brain one way or another. And that can create biases we aren't even aware of. She asked me to imagine a scenario.
ELINOR AMIT: Imagine a horrible scenario in which a terrorist takes an ax and starts slaughtering people in a bus. I'm coming from Israel, so these are the examples that I have in my mind.
VEDANTAM: As she talks, I can see a movie unfold in my head. There's blood everywhere. I can hear people screaming. I don't have to think at all; it feels terribly wrong. Then Amit asks me to think about another news event - a drone strike that sends a missile hurtling toward a target; at the center of the crosshairs, an explosion. There's dust billowing everywhere.
AMIT: So if you learn about these events from television, or from pictures in the newspaper, which one you judge as more horrible - the person with the ax that killed maybe two people, but the scene looks horrible and extremely violent; or the picture of the drone that maybe killed 100 people, but looks relatively clean and nice?
VEDANTAM: To be sure, the events Amit describes are completely different. One's a terrorist attack. The other is a military action. But it's true, the ax murderer instantly sends my brain into emotional mode. The drone strike has less vivid imagery. I can't see, up close, what the missile does. So I go into utilitarian mode. I start to think about the costs and benefits.
Amit's point is not that one mode is better than the other. It's something much more disturbing. As you listen to the news every day, hidden circuits in your brain are literally changing the ground rules by which you judge events. You think you're making consistent moral choices when really, the movies playing in your head might be making your choices for you.
Shankar Vedantam, NPR News, Washington.
(SOUNDBITE OF MUSIC)
INSKEEP: You can hear Shankar regularly on this program, talking about social science research. And you can also follow him on social media. He's @HiddenBrain on Twitter. And, of course, you can also follow this program. Many of us are on Twitter. Among other places, you can find us @MORNINGEDITION, and you can find me @NPRInskeep.
(SOUNDBITE OF MUSIC)
INSKEEP: This is NPR News.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.