Hands-Free, Mind-Free: What We Lose Through Automation

All Tech Considered

Robert Siegel is joined by author Nicholas Carr for a look at the future of automation and automobiles. Carr's new book, The Glass Cage, warns against the rise of automation in our lives.
As Siegel was driving in rush-hour traffic to Carr's hotel in Washington, D.C., the Mercedes' driver-assistance system kept him in his lane — had he strayed, it would have taken over the steering — and it maintained his distance from a taxi that cut in front.
"The car ahead of me is moving, so the car is following it. I'm not accelerating right now," Siegel says.
"That's correct. And you're not braking," Minielly adds.
"I'm not braking. And now the car ahead of me is slowing down so this car is slowing down," Siegel says.
No hands, no feet. The Mercedes was driving itself.
For Carr, features like automatic navigation demonstrate how technology gives to human beings even as it takes something away.
"At least you used to have to figure out where you were," Carr says. "And even with a paper map, you'd have to locate yourself somewhere and figure out what the landmarks around you are and kind of get a sense of place. And that's no longer necessary when you have the voice come on and say, 'In 500 yards turn left, 200 yards turn right.' I do think there's something lost there."
And it's not just behind the wheel of a vehicle that can drive itself where Carr sees a worrying loss of autonomy.
"Well, you see it in a lot of professions," including doctors and pilots, he says.
When you go into your doctor's office today, Carr says, the doctor spends much of the visit entering data into a computer — information doctors once dictated or wrote down — and working through templates designed to suggest possible diagnoses.
"Can be good, can be not so good, but [it] changes the doctor-patient relationship in very interesting ways," Carr says.
The same goes for flying. Flight has become much safer since the introduction of autopilot and increasingly automated systems, Carr says.
"The name of the book, The Glass Cage, refers to what pilots call the glass cockpit — that more and more they're flying by looking at banks of computer monitors," he says.
Because of autopilot, though, pilots aren't getting enough practice in manual flying. So when something goes wrong, their skills are rusty and they often make mistakes.
"I think the lesson isn't that automation is bad, but we have to be very wise in knowing how to automate and when to say no; let's not take more control away from the human being," Carr says.
But a lot of automated driving features work to avoid accidents.
Virginia Tech and the Virginia Department of Transportation have placed short-range radio transmitters on selected roads and highways in the state, says Ray Resendes, executive director of the Virginia Tech Transportation Institute's National Capital Region.
Resendes' Cadillac receives their transmissions and displays them on the dashboard navigation screen. Recommended speeds are shown on exit ramps, and as the car passes a school, an alert warns that schoolchildren are nearby.
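To make the idea concrete, here is a minimal sketch of how software in a car might turn such roadside broadcasts into dashboard alerts. The message format and field names below are invented for illustration — the article does not describe the actual protocol Virginia's system uses:

```python
# Hypothetical vehicle-to-infrastructure (V2I) alert handling.
# The message schema and alert wording here are assumptions for
# illustration, not the real Virginia Tech / VDOT system.

from dataclasses import dataclass

@dataclass
class RoadsideMessage:
    kind: str    # e.g. "advisory_speed" or "school_zone" (hypothetical)
    value: str   # payload, e.g. a speed in mph

def render_alert(msg: RoadsideMessage) -> str:
    """Turn a received roadside broadcast into dashboard text."""
    if msg.kind == "advisory_speed":
        return f"Recommended speed: {msg.value} mph"
    if msg.kind == "school_zone":
        return "Caution: schoolchildren nearby"
    return "Unknown alert"

print(render_alert(RoadsideMessage("advisory_speed", "25")))
# prints "Recommended speed: 25 mph"
```

The point of the sketch is only that the car is a passive receiver: the roadside transmitter decides what to broadcast, and the vehicle decides how (and how often) to surface it — which is exactly where Carr's worry about alert bombardment comes in.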
Carr says he worries about people being bombarded with such automatic alerts — in our future cars and everywhere else.
And here's another question that Carr writes about: If the car of the future will make decisions for us, how will it decide what to do when a collision is unavoidable and a computer is in charge of the steering?
"You have to start programming difficult moral, ethical decisions into the car," Carr says. "If you are gonna crash into something, what do you crash into? Do you go off the road and crash into a telephone pole rather than hitting a pedestrian?"
Resendes says addressing these issues will be a very difficult task. "A lot of times people will steer to avoid a rear-end collision and then you can run into a head-on collision, which is the worst crash, or you can hit a pedestrian," he says. "Having to codify these issues into the algorithm on a vehicle is a very serious issue."
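As a rough illustration of what "codifying" such a choice could look like, here is a toy sketch — not any real vehicle's logic — in which the programmers' ethical judgments are reduced to assumed harm scores and the software simply picks the minimum:

```python
# Toy illustration only: the maneuvers and harm scores below are
# hypothetical, chosen to mirror the article's example. A real system
# would face uncertainty at every step; the point is that someone must
# choose these weights in advance, and that choice encodes an ethics.

def choose_maneuver(options):
    """options: list of (maneuver_name, assumed_harm_score) pairs.
    Returns the name of the maneuver with the lowest harm score."""
    return min(options, key=lambda opt: opt[1])[0]

options = [
    ("brake straight, hit car ahead", 3.0),
    ("swerve off road, hit telephone pole", 5.0),
    ("swerve onto sidewalk, hit pedestrian", 100.0),
]

print(choose_maneuver(options))  # prints "brake straight, hit car ahead"
```

The sketch makes Carr's point visible: the machine isn't deciding anything — the programmers who assigned those numbers did, long before the crash.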
Carr's complaint against intrusive automation isn't just about how well or how poorly computers might make moral decisions for us. It's about the very erosion of human autonomy.
"Once we start taking our moral thinking and moral decision-making away from us and putting it into the hands not of a machine really, but of the programmers of that machine, then I think we're starting to give up something essential to what it means to be a human being," he says.