STEVE INSKEEP, Host:
If your ears cannot quite hear what someone is saying, you might want to try listening with your skin. New research shows that sensations on the skin can help you understand speech. NPR's Jon Hamilton has the story, so listen up.
JON HAMILTON: It's been clear for a long time that hearing involves more than just our ears. Bryan Gick is a professor of phonetics at the University of British Columbia.
BRYAN GICK: From our brain's point of view, we can hear with our eyes.
HAMILTON: Gick says that's why people in a noisy room are more likely to understand someone if they can see the speaker's lips. But Gick wanted to see whether hearing could also be influenced by our sense of touch. So, he had a graduate student round up some volunteers. They listened to a person making speech sounds - pa, ba, ta, da. Meanwhile, the graduate student stood by with a turkey baster.
GICK: She would squirt a little puff of air from the turkey baster onto the neck of the perceiver. And people had very strong responses to it.
UNIDENTIFIED MAN: Pa, ba, ta, da.
GICK: What we found was that the little puff of air would cause them to be more likely to report having heard a sound like pa, even if what had actually been played to them was ba.
HAMILTON: Gick says what's really interesting is that people don't normally use their skin to understand speech, yet they did it automatically. That suggests our brains are wired to incorporate information from other senses when we listen.
GICK: From my point of view, we're whole-body perceiving machines, and we just take all of the information that comes at us in our environment and merge it into a percept of something that happened in the world.
HAMILTON: That's an idea that's becoming widely accepted among brain scientists. David Ostry is a professor of psychology at McGill University in Montreal. A few months ago he did a different experiment that involved touch and hearing.
DAVID OSTRY: We put small plastic tabs on the side of the cheeks, and we had a robot basically pull on the facial skin.
HAMILTON: People did a better job recognizing a sound if the robot tugged their facial skin in the same direction it would have moved if they'd been making the sound themselves. Ostry says these days the big scientific question isn't whether our brains routinely integrate sensory information, but how.
OSTRY: Really it's up for grabs where within the brain this kind of integration is happening.
HAMILTON: Jon Hamilton, NPR News.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.