Hands-Free, Mind-Free: What We Lose Through Automation


NPR's Robert Siegel and Michael Minielly, a Mercedes-Benz representative, drive a new S550 4Matic, which allows for semi-autonomous driving. Rob Ballenger/NPR

Nicholas Carr's books are the nagging, tech-wary conscience of the digital age. In The Shallows, he warned that surfing the Internet is destroying our attention span.

Now in his new book, The Glass Cage, Carr warns us that computers are making more and more decisions for us, and we risk forgetting how to make those decisions ourselves.

He writes a lot about cars. Cars that do many things for us automatically, things we used to do and had to think about. And cars of the future that may take over the driving from us altogether.

NPR's Robert Siegel picked him up in a state-of-the-art driving machine, a 2014 Mercedes-Benz S550 4Matic.

The car is Mercedes' top of the line, highest-tech model you can drive. It parks itself. It controls the windshield wipers. And it automatically dims the high beams when oncoming cars approach.

It's even equipped with a special camera and radar that allows for semi-autonomous driving, says Michael Minielly of Mercedes-Benz USA, who was also along for the ride.

The Glass Cage: Automation and Us
by Nicholas Carr
Hardcover, 288 pages

As Siegel was driving in rush-hour traffic to Carr's hotel in Washington, D.C., this system kept him in his lane — had he strayed, it would have taken over the steering — and it maintained his distance from a taxi that cut in front.

"The car ahead of me is moving, so the car is following it. I'm not accelerating right now," Siegel says.

"That's correct. And you're not braking," Minielly adds.

"I'm not braking. And now the car ahead of me is slowing down so this car is slowing down," Siegel says.

No hands, no feet. The Mercedes was driving itself.

For Carr, features like automatic navigation demonstrate how technology gives to human beings, while also taking away.

"At least you used to have to figure out where you were," Carr says. "And even with a paper map, you'd have to locate yourself somewhere and figure out what the landmarks around you are and kind of get a sense of place. And that's no longer necessary when you have the voice come on and say, 'In 500 yards turn left, 200 yards turn right.' I do think there's something lost there."

And the wheel of a self-driving vehicle isn't the only place where Carr sees a worrying loss of autonomy.

"Well, you see it in a lot of professions," including doctors and pilots, he says.

When you go into your doctor's office today, Carr says, the doctor spends much of the visit entering data into a computer — information doctors used to dictate or write down — and working through templates that offer diagnostic hints.

"Can be good, can be not so good, but [it] changes the doctor-patient relationship in very interesting ways," Carr says.

The same goes for flying. Flight has become much safer since the inception of autopilot and more automated systems, Carr says.

"The name of the book, The Glass Cage, refers to what pilots call the glass cockpit — that more and more they're flying by looking at banks of computer monitors," he says.

As a result of autopilot, though, pilots aren't getting enough practice in manual flying. So when something bad happens, pilots are rusty and often make mistakes.

"I think the lesson isn't that automation is bad, but we have to be very wise in knowing how to automate and when to say no; let's not take more control away from the human being," Carr says.

But a lot of automated driving features work to avoid accidents.

Virginia Tech and the Virginia Department of Transportation have placed short-range radio transmitters on selected roads and highways in the state, says Ray Resendes, executive director of the Virginia Tech Transportation Institute's National Capital Region.

Resendes' Cadillac receives their transmissions and displays them on the dashboard navigation screen. Recommended speeds are shown on exit ramps, and as the car passes a school, an alert warns that school children are nearby.

Carr says he worries about people being bombarded with such automatic alerts — in our future cars and everywhere else.

"This is something called 'alert fatigue,'" he says. When you receive too many alerts, "... you actually become less alert yourself because you're just dismissing the alerts."

And here's another question that Carr writes about: If the car of the future will make decisions for us, how will it decide what to do when a collision is unavoidable and a computer is in charge of the steering?

"You have to start programming difficult moral, ethical decisions into the car," Carr says. "If you are gonna crash into something, what do you crash into? Do you go off the road and crash into a telephone pole rather than hitting a pedestrian?"

Resendes says addressing these issues will be a very difficult task. "A lot of times people will steer to avoid a rear-end collision and then you can run into a head-on collision, which is the worst crash, or you can hit a pedestrian," he says. "Having to codify these issues into the algorithm on a vehicle is a very serious issue."

Carr's complaint against intrusive automation isn't just about how well or how poorly computers might make moral decisions for us. It's about the very erosion of human autonomy.

"Once we start taking our moral thinking and moral decision-making away from us and putting it into the hands not of a machine really, but of the programmers of that machine, then I think we're starting to give up something essential to what it means to be a human being," he says.