STEVE INSKEEP, host:
It's MORNING EDITION from NPR News. I'm Steve Inskeep.
If your ears cannot quite hear what someone is saying, you might want to try listening with your skin. New research shows that sensations on the skin can help you understand speech. NPR's Jon Hamilton has the story, so listen up.
JON HAMILTON: It's been clear for a long time that hearing involves more than just our ears. Bryan Gick is a professor of phonetics at the University of British Columbia.
Professor BRYAN GICK (Phonetics, University of British Columbia): From our brain's point of view, we can hear with our eyes.
HAMILTON: Gick says that's why people in a noisy room are more likely to understand someone if they can see the speaker's lips. But Gick wanted to see whether hearing could also be influenced by our sense of touch. So, he had a graduate student round up some volunteers. They listened to a person making speech sounds - pa, ba, ta, da. Meanwhile, the graduate student stood by with a turkey baster.
Prof. GICK: She would squirt a little puff of air from the turkey baster onto the neck of the perceiver. And people had very strong responses to it.
HAMILTON: Specifically, people did a much better job recognizing certain consonants when they were accompanied by a puff of air. That's not as odd as you might think. The sounds in question were pa and ta. To make those sounds the mouth sends out a little burst of air. In contrast, the sounds ba and da do not come with a burst. And the brain seems to know that.
Gick confirmed this with a high-tech version of the turkey baster experiment. He used an automated device to deliver tiny, precisely timed puffs of air to a person's hand or neck while they listened to audio recordings with a lot of background noise.
Unidentified Man: Pa, ba, ta, da.
Prof. GICK: What we found was that the little puff of air would cause them to be more likely to report having heard a sound like pa, even if what had actually been played to them was ba.
HAMILTON: Gick says what's really interesting about this is that people don't normally use their skin to understand speech, yet they did it automatically. Gick says that suggests our brains are wired to incorporate information from other senses when we listen.
Prof. GICK: From my point of view, we're whole-body perceiving machines, and we just take all of the information that comes at us in our environment and merge it into a percept of something that happened in the world.
HAMILTON: That's an idea that's becoming widely accepted among brain scientists. David Ostry is a professor of psychology at McGill University in Montreal. A few months ago he did a different experiment that involved touch and hearing.
Professor DAVID OSTRY (Psychology, McGill University): We put small plastic tabs on the sides of the cheeks, and we had a robot basically pull on the facial skin.
HAMILTON: People did a better job recognizing a sound if the robot tugged their facial skin in the same direction it would have moved if they'd been making the sound themselves. Ostry says these days the big scientific question isn't whether our brains routinely integrate sensory information, but how.
Prof. OSTRY: Really, it's up for grabs where within the brain this kind of integration is happening.
HAMILTON: Ostry says it could be happening in areas that process sensory information or in the motor cortex, which controls our muscles.
Research on how hearing is influenced by other senses could help people with hearing loss, or people like airline pilots who need to understand speech in noisy environments. The new study appears in the journal Nature.
Jon Hamilton, NPR News.
NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.