Social Robots Raise Moral, Ethical Questions

MIT professor Sherry Turkle talks to Ari Shapiro about the arrival of human companion robots. What are the limits and dangers of projecting human qualities onto social robots? Turkle is the author of Alone Together: Why We Expect More From Technology and Less from Each Other.

Copyright © 2011 NPR. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.

ARI SHAPIRO, host:

Astronauts on the International Space Station will soon unpack the first humanoid robot in space. Robonaut, or R2, has two arms and two hands, each with five fingers. It will work alongside real astronauts on the Space Station. This comes as robots become a bigger part of our daily lives than ever. In some nursing homes, robotic babies are providing comfort to the elderly.

To talk more about these developments, we called MIT professor Sherry Turkle. She's the author of "Alone Together: Why We Expect More from Technology and Less from Each Other."

Welcome.

Professor SHERRY TURKLE (Author, "Alone Together: Why We Expect More from Technology and Less from Each Other"): My pleasure to be here.

SHAPIRO: How does this robot in space fit into what's happening with robots in our daily lives?

Prof. TURKLE: Well, there are two kinds of ways robots are entering into our daily lives. The first is instrumental, and this robot in space is that kind. It works to help us with chores. We have Roomba vacuum cleaners that are helping us clean our rugs. We are going to have robots that are going to assist the elderly in reaching for things on high shelves. These are assistive robots.

The second kind of robot that is starting to be introduced are robots that serve as human companions, as eldercare bots, as nanny bots, and these robots, I think, raise significant moral and ethical questions that we need to discuss.

SHAPIRO: Before we get to the moral and ethical questions, I want to ask about the distinction, because this robot in space may be purely for the function of assisting the astronauts. But the fact that it's shaped like a human - that it's even writing charming messages on Twitter - suggests that it is not exclusively functional. It is also meant to be somehow appealing to humans.

Prof. TURKLE: Well, absolutely. I mean, I think that this line is kind of a thin one. The difference is, when this robot tweets, it's not pretending to have a personality. But the robots that I'm concerned about are proposing themselves as substitutes for human beings in these more intimate roles.

SHAPIRO: You know, it's funny, when I hear you express your concerns about robots, I think back to what the traditional stereotypical fear of the robot is, which I think is best captured in the movie "2001: A Space Odyssey."

(Soundbite of movie, "2001: A Space Odyssey")

Mr. KEIR DULLEA (Actor): (as Dr. Dave Bowman): Open the pod bay doors, Hal.

Mr. DOUGLAS RAIN (Actor): (as Hal 9000): I'm sorry, Dave. I'm afraid I can't do that.

SHAPIRO: Sherry Turkle, the typical fear of the robot is that it will attack, overpower and defy us. Your fear is that it will entice us.

Prof. TURKLE: Well, absolutely. The issue now for us is more captured in a movie like "WALL-E," where it's the robots who teach the people how to love. In my research, in which I interviewed hundreds of people about robots, what comes up over and over again is people's disappointment in people. And their fantasy about robots is that, somehow, these robots will be more human than the disappointing humans around them.

I interviewed one woman who says, you know, I could imagine having a robot boyfriend. My boyfriend is disappointing. I just want the feeling of civility in the house, and I could see a robot providing just that.

SHAPIRO: If we're not tall enough to reach a high shelf, we have no problem with a robot reaching the shelf for us.

Prof. TURKLE: Mm-hmm.

SHAPIRO: But if we're not loving enough to be a good spouse, we have a real problem with a robot filling that need.

Prof. TURKLE: I have a real problem. I'm unabashedly species-chauvinistic, because a robot can pretend when it says I love you, but it doesn't know what it is to have desire, a sexual life, appetites. It doesn't have the arc of a human life.

SHAPIRO: But the people you mentioned in your research say that whether it has those things or not, it does a better job than the people who do.

Prof. TURKLE: Yes. People are starting to say that the simulation of feeling, the simulation of love - I might take that. I see that movement in the interviews that I do, and it made me want to put that out there for people to talk about.

SHAPIRO: MIT Professor Sherry Turkle is author of "Alone Together: Why We Expect More from Technology and Less from Each Other."

Thanks for talking with us.

Prof. TURKLE: My pleasure.


NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
