Harnessing Thoughts To Control A Computer

Researchers decoded electrical brain signals without implanting electrodes, according to a new study. Instead, Jose L. Contreras-Vidal and colleagues monitored brain activity with EEG sensors placed on the scalp, using those signals to reconstruct hand movement and drive a robot.

Copyright © 2010 NPR. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.

IRA FLATOW, host:

You're listening to SCIENCE FRIDAY from NPR. I'm Ira Flatow. We've all seen those wacky science fiction movies, right, where someone is practicing mind control? They're wearing what looks like a shower cap, with lots of wires coming out of it. Well, it's fiction no more and not just a Hollywood creation. Try the Journal of Neuroscience.

Here's how it works. When we think, our brains generate electrical signals. So when you move your hand forward, you make one signal pattern; you move it back, you make another. Well, for the first time, researchers are able to monitor and decode these signals without having to implant anything in your brain - no tiny little wire electrodes, like they used to have to use. Instead, they are listening with EEG sensors on the scalp.

And my next guest translated these electrical signals into a three-dimensional hand motion, and now he's working on feeding those signals into a computer to drive a mouse. In fact, you can see his apparatus on our Web site at sciencefriday.com.

Jose Contreras-Vidal is also an author of the new study and is a professor in the department of kinesiology and the graduate program of neuroscience and cognitive science at the University of Maryland in College Park. Welcome to SCIENCE FRIDAY, Professor Vidal.

Professor JOSE CONTRERAS-VIDAL (Professor, Department of Kinesiology, Graduate Program of Neuroscience and Cognitive Science, University of Maryland): Hi, Ira. It's a pleasure to be here.

FLATOW: So what breakthrough did you have to make to be able to just put the electrodes on the scalp instead of sticking wires inside?

Prof. CONTRERAS-VIDAL: Yes. For a long time, it was thought that the information content and the amount of noise in EEG signals, which are measured outside the scalp, were a limiting factor for extracting useful information, and we never, you know, thought that was a possibility.

So what we have shown now is that in fact it's possible to extract useful information from your brain using these sensors that are non-invasive.

FLATOW: Dr. Contreras-Vidal, did you come up with this yourself, or was this something that's taken a while to work out?

Prof. CONTRERAS-VIDAL: No, we started working in this area in 2004, doing some preliminary work in terms of the signal processing, the algorithms that we use to clean the signals and to extract the information required.

FLATOW: And your aim, then, is to create devices where, if you think about it, it will happen.

Prof. CONTRERAS-VIDAL: Well, that's certainly one application, but the applications go beyond that. For example, because we now understand better how movement is represented in the brain at the macro-scale afforded by EEG, we can compare that, for example, during motor development in children, or we can compare that with a pattern that might be affected by Parkinson's disease, and so on and so forth.

FLATOW: And different motions - I'm guessing from your work and reading your paper - different motions show distinctive signals in the brain.

Prof. CONTRERAS-VIDAL: Well, let me tell you how this works. So we have recorded the brain activity from many sensors on the head at the same time as we recorded the motion of the hand. Now, what we developed is a decoder, or an algorithm, that basically extracts information from many sensors and also integrates that information over the recent past. And when you combine all those signals, then, using our algorithm, it's possible to predict where your hand is going to be in space at the next point in time.
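
To make that decoding idea concrete, here is a minimal sketch - not the authors' actual code - of a linear decoder that predicts hand velocity from the recent past of many EEG sensors. The number of lags, the use of ridge regression, and all names and shapes are illustrative assumptions.

```python
# A minimal sketch of the decoding idea described above: predict hand velocity
# at each time step from the recent past of many EEG sensors, using lagged
# samples as features. Lag count, ridge regression, and shapes are assumptions.
import numpy as np
from sklearn.linear_model import Ridge

def build_lagged_features(eeg, n_lags):
    """Stack the current sample and the previous n_lags EEG samples per row.

    eeg: array of shape (n_samples, n_sensors), already filtered.
    Returns shape (n_samples - n_lags, n_sensors * (n_lags + 1)).
    """
    rows = [eeg[t - n_lags:t + 1].ravel() for t in range(n_lags, eeg.shape[0])]
    return np.array(rows)

def train_decoder(eeg, hand_velocity, n_lags=10):
    """Fit a linear map from lagged EEG to 3-D hand velocity."""
    X = build_lagged_features(eeg, n_lags)
    y = hand_velocity[n_lags:]          # align targets with the lagged features
    decoder = Ridge(alpha=1.0)          # regularization strength is an assumption
    decoder.fit(X, y)
    return decoder

# Usage with synthetic data of plausible shapes (34 sensors, 3-D velocity).
rng = np.random.default_rng(0)
eeg = rng.standard_normal((2000, 34))
vel = rng.standard_normal((2000, 3))
decoder = train_decoder(eeg, vel)
predicted_vel = decoder.predict(build_lagged_features(eeg, 10))
```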

FLATOW: And you can predict that just from the experience that you have with recording the brain waves.

Prof. CONTRERAS-VIDAL: Exactly.

FLATOW: Can you...

Prof. CONTRERAS-VIDAL: Something that's very interesting is that we found that the contribution, the greatest contribution to the decoding, was located over a very important area of the brain for motor control, the (unintelligible) sensorimotor area. And this activity was 60 milliseconds ahead of the movement of the hand, which is consistent with the delays from the processing in cortex of the motor command, the transmission of the signal through the spinal cord, and then finally reaching the muscle.
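
As an illustration of how such a "greatest contribution" finding can be read off a linear decoder, the sketch below inspects the fitted weights by sensor and by time lag. It assumes the feature layout from the earlier sketch and is not the study's actual analysis.

```python
# A hedged sketch of locating which sensors and time lags contribute most to a
# linear decoder: average the absolute fitted weights over the three output
# dimensions and group them by (time lag, sensor).
import numpy as np

def contribution_map(coef, n_sensors, n_lags):
    """coef: decoder weights of shape (3, n_sensors * (n_lags + 1)),
    e.g. the .coef_ attribute of the ridge model in the earlier sketch.
    Rows of the returned map run from the oldest lag to the current sample."""
    weights = np.abs(coef).mean(axis=0).reshape(n_lags + 1, n_sensors)
    best_lag_row, best_sensor = np.unravel_index(weights.argmax(), weights.shape)
    return weights, best_lag_row, best_sensor

# Usage: contribution_map(decoder.coef_, n_sensors=34, n_lags=10) points to the
# sensor location and time offset that carry the most weight in the prediction.
```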

FLATOW: Yeah, it took time for the signal to get to the hand from the brain.

Prof. CONTRERAS-VIDAL: Yeah, exactly.

FLATOW: Why the hand? Is that because it has so many signals all the time coming from it, we use it so much? Why choose the hand as an area to study?

Prof. CONTRERAS-VIDAL: Well, the hand is, you know, a very important part of the body that we use every day for, you know, reaching, grasping and combing your hair. But also, it's, you know, a part of the body that has been used extensively in non-invasive and in invasive studies, where you place electrodes inside the brain. So we wanted to be able to compare our algorithm to the invasive ones.

FLATOW: I mentioned before that one of your aims is to try to get people to think about, to think and move a mouse, a computer mouse, by thinking about it. Is that possible?

Prof. CONTRERAS-VIDAL: Yeah, that's correct. In fact, one of my students, Trent Bradberry, who is the lead student author on this paper - we are now doing exactly what you say. We are asking people to think about moving, and, you know, we have recorded signals, and we use those signals to control the computer mouse. And we hope that within the next few months, we will be able to demonstrate these applications.
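
A hedged sketch of the cursor-control step he describes: integrate a decoded two-dimensional velocity estimate into a cursor position on each update. The gain, update rate, and screen size here are assumptions for illustration, not details from the lab's system.

```python
# Integrate a decoded 2-D velocity estimate into a cursor position each update
# and clamp it to the screen. Gain, time step, and screen size are assumptions.
import numpy as np

def update_cursor(position, decoded_velocity, gain=200.0, dt=0.05,
                  screen=(1920, 1080)):
    """Advance the cursor by one time step of decoded velocity."""
    position = np.asarray(position, dtype=float) + gain * dt * np.asarray(decoded_velocity)
    return np.clip(position, [0.0, 0.0], [screen[0] - 1.0, screen[1] - 1.0])

cursor = np.array([960.0, 540.0])                 # start at screen center
cursor = update_cursor(cursor, [0.8, -0.3])       # one decoded velocity sample
```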

FLATOW: A question from Second Life from Tigerpaws(ph) says: Could this set a new standard when determining if someone is truly brain-dead?

Prof. CONTRERAS-VIDAL: That's...

(Soundbite of laughter)

Prof. CONTRERAS-VIDAL: Let me tell you that one advantage of using EEG is that we are recording from many parts of the brain, and we know that movement is represented in multiple ways in the brain - there are multiple representations for movement. So even if one part of the brain has suffered an injury or has been damaged, we can still extract signals from other parts of the brain.

FLATOW: Interesting.

Prof. CONTRERAS-VIDAL: So I think that's a definite advantage that we have using EEG because we see the whole brain, basically.

FLATOW: Let's go to Jessica(ph) in Augusta, Kansas. Hi, Jessica.

JESSICA (Caller): Hi, thanks for having me on the show.

FLATOW: You're welcome. Go ahead.

JESSICA: I had a question, I guess kind of about differentiating your thoughts. I mean, you know, you always have one of those experiences where you're sitting on the couch, and thinking, man, I could really use a sandwich, and you want a sandwich, but you don't get up and do it. And then there's that time when you decide, okay, I'm going to get up, I'm going to do it. How would you be able to differentiate those different desires using your technology?

FLATOW: Good question.

Prof. CONTRERAS-VIDAL: Yeah, excellent question, and it's in fact something that's very important for the practical implementation of brain-machine interfaces. We want to have multitasking capabilities. So we walk and, you know, reach for a drink, a glass of water, and call our friend. So you know, there's a lot of information that could contaminate the signal that you are interested in decoding.

Now, in our experiment, you know, we asked our subjects, or our participants, to self-select the targets to aim for and also to decide when to move their arm. So there were, you know, some aspects of decision-making in terms of selection of the target and also timing - timing the movement, not only the movement itself.

So this study already, you know, demonstrates that it is possible to extract information about hand motion, at least, even in the presence of other types of information, for example decision-making and so on, so forth.

FLATOW: Jessica, thanks.

JESSICA: Uh-huh, thanks.

FLATOW: Bye-bye. Do you ever think about testing this out on people who have disabilities, on people with Lou Gehrig's disease who have trouble? You know, they're still awake, and they're aware. I'm thinking of Stephen Hawking-type people who might be able to move things now with thought.

Prof. CONTRERAS-VIDAL: Yeah, that's our next step. Clearly, an assistive device like this would be very helpful for people with disabilities, for example, spinal-cord injury or stroke. I think you mentioned that. And yes, so that's in our plans, and we hope to be able to report on that, too.

FLATOW: Are there some people whose signals are easier to decode than other people?

Prof. CONTRERAS-VIDAL: Yeah, that's an interesting question. We in fact found that the decoding accuracy of our method depended on the quality of the movement, okay. The way I'd like to explain this is that you can imagine, you know a Major League baseball player who, you know, performs perhaps hundreds of throws, you know...

FLATOW: A pitcher, right.

Prof. CONTRERAS-VIDAL: Yeah, a pitcher, you know, every day, compared to, you know, to a person just like me or perhaps you that, you know, that we don't get...

FLATOW: You throw your gum away once a day or something.

(Soundbite of laughter)

Prof. CONTRERAS-VIDAL: Right. So that baseball player, you know, has achieved a high level of skill in performing that particular action, and we expect the representation of that movement in the brain to be very well-refined and also, you know, very highly consolidated, compared with the motor program of a naive person.

And so what our results suggest is that the decoding of the movement by the baseball player will be easier, you know...

FLATOW: Mm-hmm.

Prof. CONTRERAS-VIDAL: ...will have greater accuracy. However, this also indicates that once you connect a device with our algorithm, you could train the person, you know, to regain a level of skill with the interface that will be comfortable to him or her.

FLATOW: Yeah. You sort of practice that action.

Prof. CONTRERAS-VIDAL: Exactly.

FLATOW: Yeah. You think about doing it and you say, yes, a little different, a little more, you got it almost, you know, that's what it's like.

Prof. CONTRERAS-VIDAL: Yeah. I mean, the brain is a highly plastic, you know, structure. And it likes to get better at doing things, so we can benefit, we can profit from the plasticity of the brain to...

FLATOW: Right.

Prof. CONTRERAS-VIDAL: ...try to incorporate the interface, the machine, the assisting device in...

FLATOW: Right.

Prof. CONTRERAS-VIDAL: ...your brain. Yeah.

FLATOW: Well, let me ask then a corollary to that question. What if you have an area of the brain that's damaged, could you train a different area to take over that spot, and...

Prof. CONTRERAS-VIDAL: Right. That's also an excellent question. So I mentioned before that previous studies have shown that the representations of movement in the brain are multiple, you know?

FLATOW: Right.

Prof. CONTRERAS-VIDAL: Different areas of the brain can be used in principle to extract this information. So even if you have a lesion, you know, a stroke in primary motor cortex which is associated with the production of movement, you could still use...

FLATOW: Hmm.

Prof. CONTRERAS-VIDAL: ...other areas that carry information about movement as well.

FLATOW: Hmm. 1-800-989-8255. Let's go to Mike(ph) in South Bend. Hi, Mike.

MIKE (Caller): Hey, how's it going?

FLATOW: Hi.

MIKE: Thanks for taking my call.

FLATOW: You're welcome.

MIKE: I was basically just wondering - I've been hearing a lot about recording the brain function of certain people and being able to map that out. And let's say you took the brain function and the brain activity of an expert guitar player and recorded that. Then maybe - basically my question is, is there a way that maybe in the future they could take that information and put it into somebody that's never played the guitar before, and all of a sudden they'd be an expert because they've got that recorded function?

FLATOW: Hmm.

Prof. CONTRERAS-VIDAL: Well, yeah, that's an interesting thought. What I suppose could be done is to, you know, take the knowledge that we have about what these representations of movement look like in terms of brain activity, compare that, you know, for rehabilitation purposes, with individuals that don't have that skill or don't have normal movement, and try to devise a method to restore the pattern of activity. So, you know, in principle it may be possible to reshape the pattern of activity using some other...

FLATOW: Mm-hmm.

Prof. CONTRERAS-VIDAL: ...techniques.

FLATOW: You're listening to SCIENCE FRIDAY from NPR. I'm Ira Flatow, here talking with Jose Contreras-Vidal, who is an author of a new study about brain control of limbs and things. Of course, people are going to want to know, professor, when they're going to be able to see some practical results from this sort of...

Prof. CONTRERAS-VIDAL: Well, yes...

FLATOW: Right. I bet you get that asked every 10 minutes.

(Soundbite of laughter)

Prof. CONTRERAS-VIDAL: Yes. That's correct. And as I said before, we are working hard on that. We hope that within a few months we will be able to demonstrate a brain computer interface that can be driven by thought alone. So (unintelligible)...

FLATOW: So it's in experimental stages right now?

Prof. CONTRERAS-VIDAL: Right. We are - basically we're trying to uncover, you know, what is the best way to...

FLATOW: Mm-hmm.

Prof. CONTRERAS-VIDAL: ...train our decoder algorithm. What is the best way to link the brain signals to the machine?

FLATOW: Right. What about a mind-controlled robot? You know, a whole robot you can control just by thinking about it?

Prof. CONTRERAS-VIDAL: So one application, for example - I'm thinking about a below-elbow amputee who would like to restore hand function.

FLATOW: Mm-hmm.

Prof. CONTRERAS-VIDAL: So we are pursuing (unintelligible) to decode the gestures of the hand, for example, when you grasp an object. And so we, in principle, could use those signals to drive a robotic arm or a hand. And that's one of the applications that we are also very interested in.
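
To illustrate the gesture-decoding idea in the simplest terms, here is a hypothetical sketch that classifies a window of EEG features into a small gesture set and maps the result to a robot command. The classifier choice, gesture labels, and command format are assumptions, not details from this work.

```python
# Classify a window of EEG features into a small set of hand gestures and map
# the prediction to a command for a robotic hand. Classifier, labels, and the
# command format are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

GESTURES = ["rest", "open", "grasp"]              # illustrative label set

def train_gesture_classifier(windows, labels):
    """windows: (n_trials, n_features) EEG features; labels: gesture indices."""
    clf = LinearDiscriminantAnalysis()
    clf.fit(windows, labels)
    return clf

def drive_hand(clf, window):
    """Predict a gesture for one EEG window and return a robot command dict."""
    gesture = GESTURES[int(clf.predict(window.reshape(1, -1))[0])]
    return {"command": gesture}                   # stand-in for a real robot API

# Usage with synthetic data of plausible shapes.
rng = np.random.default_rng(1)
X = rng.standard_normal((90, 64))                 # 90 trials, 64 features
y = rng.integers(0, 3, size=90)
print(drive_hand(train_gesture_classifier(X, y), X[0]))
```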

FLATOW: Mm-hmm. And I imagine that there must be other laboratories where this must be a hot topic of research these days.

Prof. CONTRERAS-VIDAL: Yeah. There are laboratories around the world, and many of them are focusing on the invasive methods...

FLATOW: Mm-hmm.

Prof. CONTRERAS-VIDAL: ...but some of them are also using EEG signals and some other non-invasive types of signals. But until now, you know, it hadn't been shown that this is possible to do with EEG for the complexity of movement that we demonstrated.

FLATOW: Can you decode other types of feelings that are not related to movement - maybe, you know, sadness, or happiness, or depression, or anything like that?

Prof. CONTRERAS-VIDAL: We - yeah. In principle, it's possible to decode other internal states of the brain - as you mentioned, stress or fear. And together with colleagues here at the University of Maryland, we are looking at exactly that question, and, you know, we hope to come up with that.

FLATOW: But the big take home from what we're talking about is the technology, right? That you don't have to invasively stick wires in people's brains anymore.

Prof. CONTRERAS-VIDAL: Exactly.

FLATOW: You can now record and decode with these things, like the old things you see in the sci-fi movies - the shower cap with the wires coming out of it, right? Is it more complex...

Prof. CONTRERAS-VIDAL: Yeah. I know that...

FLATOW: ...so how much more complex than that?

(Soundbite of laughter)

Prof. CONTRERAS-VIDAL: Yeah. I think that's the whole message from this work, that it is possible to use a non-invasive method to extract information that can be used to control an assistive device. And that basically eliminates all the potential risk associated with surgery or with implanting an array of electrodes inside your brain, which makes it possible to think about applications in clinical populations at risk like, you know, children and the elderly.

FLATOW: Mm-hmm. Well, I wish you luck and...

Prof. CONTRERAS-VIDAL: Thank you very much.

FLATOW: ...do you need a new kind of a shower cap? Do you have to develop anything new in that technology or is it all there?

Prof. CONTRERAS-VIDAL: Well, there are different types of caps available, and that doesn't seem to really be a critical problem.

FLATOW: Yeah. That's not the limiting factor here.

Prof. CONTRERAS-VIDAL: No, that's not a limiting factor.

FLATOW: Okay, Professor, thanks for taking time to be with us today.

Prof. CONTRERAS-VIDAL: Thank you very much.

FLATOW: You're welcome. Jose Contreras-Vidal is the author of this new study that just came out, about the EEG sensors on the scalp. And he's a professor in the department of kinesiology and the graduate program in neuroscience and cognitive science at the University of Maryland in College Park.

We're going to take a break and change subjects. When we come back, we're going to talk about dinosaurs. So - and there's - there are a lot of other animals related to dinosaurs that lived at the same time that you may not even have heard about. And we're going to talk about who they were and how the discovery of fossils of one has pushed back the age of dinosaurs, changed the whole - you'll see. We'll talk about that. Stick with us. We'll be right back.

Copyright © 2010 NPR. All rights reserved. No quotes from the materials contained herein may be used in any media without attribution to NPR. This transcript is provided for personal, noncommercial use only, pursuant to our Terms of Use. Any other use requires NPR's prior permission. Visit our permissions page for further information.

NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
