Humans use their whole bodies, not just their ears, to understand speech, according to University of British Columbia linguistics research.
It is well known that humans naturally integrate facial expressions with what they hear to fully understand what is being communicated. The UBC study is the first to show that we also naturally process tactile information when perceiving speech sounds.
Prof. Bryan Gick of UBC’s Dept. of Linguistics, along with PhD student Donald Derrick, found that air puffs directed at skin can bias perception of spoken syllables. “This study suggests we are much better at using tactile information than was previously thought,” says Gick, also a member of Haskins Laboratories, an affiliate of Yale University.
The study, published in Nature, offers findings that may be applied to telecommunications, speech science and hearing aid technology.
English speakers use aspiration—the tiny bursts of breath accompanying speech sounds—to distinguish sounds such as “pa” and “ta” from unaspirated sounds such as “ba” and “da.” Study participants heard eight repetitions of these four syllables while inaudible air puffs—simulating aspiration—were directed at the back of the hand or the neck.
When the subjects—66 men and women—were asked to distinguish the syllables, those heard simultaneously with air puffs were more likely to be perceived as aspirated, causing the subjects to mishear “ba” as the aspirated “pa” and “da” as the aspirated “ta.” The brain associated the air puffs felt on the skin with aspirated syllables, interfering with perception of what was actually heard.
“Our study shows we can do the same with our skin, ‘hearing’ a puff of air, regardless of whether it got to our brains through our ears or our skin,” says Gick.
Future research may include studies of how audio, visual and tactile information interact to form the basis of a new multi-sensory speech perception paradigm.
via Science Daily