
Humans Worry About Self-Driving Cars. Maybe It Should Be The Reverse


Instead of cars terrorizing people, one researcher is asking whether people might be terrorizing self-driving cars. Noah Berger/AFP/Getty Images

Self-driving cars will behave rationally: for example, stopping when someone is in their way. Research suggests humans will take advantage of that, stepping into an intersection when they know they shouldn't.

RACHEL MARTIN, HOST:

I don't know about you, but when I've seen these images of self-driving cars in TV ads or in some other way, it's a little bit disconcerting, I mean, the idea that this thing is just out there making life-and-death decisions at every turn. So apparently there's some new social science research that is examining something that's a little counterintuitive. Instead of cars terrorizing people, one researcher is asking whether people might be terrorizing self-driving cars. I need some help explaining this, so NPR's own Shankar Vedantam is here to help me. Hi, Shankar.

SHANKAR VEDANTAM, BYLINE: Hi, Rachel.

MARTIN: What does this mean, people terrorizing self-driving cars?

VEDANTAM: Well, let me set this up for you. I was talking with Adam Millard-Ball - he's a professor of environmental studies at the University of California at Santa Cruz - and he's modeled what's going to happen when self-driving cars start showing up on the road.

MARTIN: OK.

VEDANTAM: He told me that, at its core, driving is not just about the physics of moving objects and the law of who can do what on the road. It's also about psychology. People have developed really complex and often unspoken rules of how to interact with one another on the road. You can teach a self-driving car all the rules and give it the tools to navigate around obstacles, but can these cars deal with all the psychological games that human drivers and pedestrians play on the roads?

MARTIN: I mean, I know that there's psychological warfare on the road for sure, but tell me how that applies to this situation.

VEDANTAM: So when we think of all the technologies that self-driving cars need, we often make a big assumption and that big assumption is that rational behavior is always the right course of action on the road. Millard-Ball told me he once took a taxicab ride in New York. Unlike the standard issue New York cab driver, this one actually drove his car rationally, deliberately and followed all the rules of the road. In other words, he drove like a self-driving car.

ADAM MILLARD-BALL: He didn't try and cut in. He yielded to pedestrians and cyclists when he should, and it took two or three times as long to get across Manhattan as it would have done otherwise.

MARTIN: (Laughter) But it was a safer experience. And we note that because that feels exceptional - right? - because you can get in a cab and think that your life's on the line sometimes.

VEDANTAM: Exactly. Now, no one in their right mind would program a self-driving car to behave irrationally. But when you think about it, a good part of driving today involves the unspoken assumption that other drivers may not always behave rationally. When it comes to interacting with self-driving cars, humans will know that the robot who's operating the car will always do the rational thing.

MILLARD-BALL: A self-driving car is not going to be drunk. It's not going to look down to check its phone. It's not going to be distracted, and it's not going to be sociopathic in that it's not going to kind of dare a pedestrian to walk out in front of it.

MARTIN: And so is that going to make us, the pedestrians, or just the other drivers, less cautious?

VEDANTAM: I think so. I mean, if you know the other car is always going to stop, even if you are in the wrong, you now have a psychological incentive to play a game of chicken, because in a game of chicken, crazy beats sane. Or think of pedestrians. You know, today I know that if I step into a busy intersection, some crazy driver who is texting his girlfriend might hit me, so I do the rational thing and I stay on the sidewalk. But if I was certain the car is always going to stop, and presumably self-driving cars will be programmed to stop, shouldn't we expect lots of pedestrians to boldly step in front of cars?

MARTIN: I'm going to drive the bus.

VEDANTAM: (Laughter).

MARTIN: I don't know if that's going to make me safer or not. Shankar Vedantam, NPR's social science correspondent, he's also the host of a podcast that explores the unseen patterns in human behavior. It's called Hidden Brain. Thanks, Shankar.

VEDANTAM: Thank you, Rachel.

Copyright © 2017 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.