
ALEX CHADWICK, host:

This is Day to Day. I'm Alex Chadwick.

MADELEINE BRAND, host:

I'm Madeleine Brand.

Two monkeys were able to use their thoughts to control a robotic arm. The animals fed themselves with nuanced, fluid motions. The hope is that this technology can eventually be used by people who are paralyzed. The research is out this week in the journal "Nature." Andrew Schwartz at the University of Pittsburgh School of Medicine is the lead author on the paper, and he's here now. Describe how you carried it out. You had a monkey and he was - how did you attach the arm? And what was he reaching for?

Professor ANDREW SCHWARTZ (University of Pittsburgh School of Medicine): So, the monkey's seated right next to the robot arm and the robot arm's suspended so that the shoulder of the robot arm is right next to the monkey's own shoulder. And the monkey just has his arms gently restrained in some tubes, and the only way that the monkey can reach for this food is by controlling the robot arm to do it for him.

BRAND: You had a treat that the monkey was interested in?

Prof. SCHWARTZ: Right, we'd hold fruit or marshmallows or zucchini in front of the animal.

BRAND: Zucchini? (Laughs)

Prof. SCHWARTZ: And then he'd have to reach out with his arm, which was a pretty natural looking mechanical arm, and it had a little gripper at the end. And the animal had to reach for the food and close the gripper around it and then lift it off a peg and bring it back to his mouth.

BRAND: Tell us how you conducted this research. How did you get the monkey to use the robotic arm?

Prof. SCHWARTZ: We record brain signals and then take those brain signals and feed them to a computer, and the computer interprets the signals and generates commands to a robot arm.

BRAND: So, you actually have little implanted devices in the brain that pick up these signals and translate them into the robotic arm?

Prof. SCHWARTZ: Right. We have a little array, a little microchip that's about half the size of a thumbtack that's in the cortex of these monkeys.

BRAND: And what is surprising about this? Because this kind of research has been done in the past. What was surprising about your findings?

Prof. SCHWARTZ: Well, the general idea is that we are recording from this really complex network of neurons. Yet we are able to get a really simple signal that is very robust. So, it's reliable and easy to understand.

BRAND: So, the monkey was pretty easily able to figure out what to do? How to move the arm?

Prof. SCHWARTZ: Yes, and I think the reason for that is because this is the natural kind of activity that takes place when the animal really uses these signals to control its own arm. It works really well and the monkey can learn it in about two or three days.

BRAND: Two or three days?

Prof. SCHWARTZ: That's right, and you know, I think with humans it will be even - just as rapid or even faster.

BRAND: One would hope. What are the barriers to using this for humans?

Prof. SCHWARTZ: Well, one of the problems is that we use these electrodes that are implanted permanently in the brain, and the brain tends to treat this as a foreign object and form scar tissue around these little tiny electrodes. They are about the size of a human hair. And so when the scar tissue forms around them, then our signal degrades and we have a lot tougher time trying to understand what those signals mean.

BRAND: So, how long would the electrodes last?

Prof. SCHWARTZ: Well, anywhere from months to years. So, we've had some monkeys go for six or seven years. The monkey used in this study is still going, and that's been over a year and a half so far. The signals degrade, but we can still use some aspects of those signals for the control.

BRAND: When do you anticipate it being used in humans?

Prof. SCHWARTZ: Well, on a research basis, we expect to be able to do this in the next year or two.

BRAND: So, what's next for you and your research?

Prof. SCHWARTZ: In terms of developing this application, this prosthetic, it will be adding a hand and fingers, and along with that we are going to need to add a sense of touch, because as you grip something, you need to have a tactile sensation to really control a hand well.

BRAND: So then the signals would be going back to the brain.

Prof. SCHWARTZ: That's right. They'll be going back to the brain, and we hope to activate sensory cortex to give a sense of touch back to these subjects.

BRAND: Well, thank you very much.

Prof. SCHWARTZ: Well, thank you.

BRAND: That's Andrew Schwartz. He's a professor of neurobiology at the University of Pittsburgh School of Medicine.

Copyright © 2008 NPR. All rights reserved. No quotes from the materials contained herein may be used in any media without attribution to NPR. This transcript is provided for personal, noncommercial use only, pursuant to our Terms of Use. Any other use requires NPR's prior permission. Visit our permissions page for further information.

NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
