We not only hear with our ears, but also through our skin, according to a new study.
The finding, based on experiments in which participants listened to certain syllables while puffs of air hit their skin, suggests our brains take in and integrate information from various senses to build a picture of our surroundings.
Along with other recent work, the research flips the traditional view of how we perceive the world on its head.
“[That’s] very different from the more traditional ideas, based on the fact that we have eyes so we think of ourselves as seeing visible information, and we have ears so we think of ourselves as hearing auditory information. That’s a little bit misleading,” study researcher Bryan Gick of the University of British Columbia, Vancouver, told LiveScience.
“A more likely explanation is that we have brains that perceive rather than we have eyes that see and ears that hear.”
With such abilities, Gick views humans as “whole-body perceiving machines.”
The research, funded by the Natural Sciences and Engineering Research Council of Canada and the National Institutes of Health, is detailed in the Nov. 26 issue of the journal Nature.
How we perceive
Gick’s work builds on past studies showing, for instance, that we can see sound and hear light, even if we don’t consciously realize it. Other studies show that if you watch another person’s lips move and believe that person is speaking, your brain’s auditory regions will light up, Gick said.
Scientists had explained such sensing prowess as the result of experience: we see and hear people speaking all the time, so it would be only natural to learn to integrate what we see with what we hear.
The alternative would be an innate ability. And so Gick and his colleague Donald Derrick, also of the University of British Columbia, studied two senses that aren’t generally paired – auditory and tactile – to figure out the root of perception.
How skin hears
The team focused on aspirated sounds, such as “pa” and “ta,” which involve an inaudible burst of air when spoken, as well as unaspirated sounds, such as “ba” and “da.”
Blindfolded participants listened to recordings of a male voice saying each of the four syllables and had to press a button to indicate which sound they heard (pa, ta, ba or da). Participants were divided into three groups of 22: one group heard the syllables while a puff of air was blown onto their hand, another heard them while air was blown onto their neck, and a control group heard the sounds with no air.
About 10 percent of the time when air was puffed onto the skin, participants mistakenly perceived the unaspirated syllables as their aspirated equivalents. So when the speaker said “ba,” those participants would indicate they heard “pa.” The control group showed no such mistaken perceptions.
A follow-up experiment in which participants got a tap on the skin rather than a puff of air showed no such mix-up between aspirated and unaspirated sounds.
Next, Gick is working with scientists from the University of California, San Francisco, to figure out how the brain allows such multi-sense integration.
– via Yahoo