Can You Repeat That?


In a recent study published in PNAS, Professor Barbara Shinn-Cunningham (BME) and her coauthors explored why some people with normal hearing have difficulty understanding conversations in complex auditory environments such as bars.

Overexposure to loud music from iPods and other sound systems is known to cause permanent hearing loss, but new research suggests that even before an audiologist can detect the damage, such exposure may interfere with everyday communication. An individual may have “normal hearing” based on standard tests that measure the quietest sound a person can hear, yet still have trouble understanding what her best friend is telling her at a crowded bar.

The problem may lie in the “first-responder” portion of the auditory system, which encodes the detailed structure of incoming sounds before the brain processes them further, according to a study led by Professor Barbara Shinn-Cunningham (BME) that appeared in the August 15 online edition of Proceedings of the National Academy of Sciences.

In the study, Shinn-Cunningham and her coauthors, BME PhD students Dorea Ruggles and Hari Bharadwaj—all members of the Boston University Hearing Research Center—reveal significant variations in how well listeners with normal hearing filter out distracting sound sources and focus on a desired one in complex auditory environments. The researchers correlate these variations with undiagnosed differences in how the most peripheral part of the auditory system encodes sound in the brain, and speculate that defective encoding may be due to nerve fiber loss resulting from overexposure to high-volume noise sources common in modern society.

“Up to now, we didn’t know if such problems were due to impairments in the cortex, where decision-making and language processing take place, or even earlier in the auditory system, where basic sensory information is first encoded,” Shinn-Cunningham explained. “Our results suggest that the fidelity of early sensory encoding in the subcortical brain determines the ability to communicate in challenging settings.”

In the short term, the study’s findings may enable audiologists to diagnose auditory processing deficiencies, and thus advise patients on how to compensate for them in complex social settings from sports stadiums to corporate boardrooms. In the long term, the research may also lead to more effective hearing aid technologies.

Shinn-Cunningham and her collaborators arrived at their findings by evaluating subjects’ ability to discriminate simple properties of audible sound, and subsequently obtaining physiological measures of early sensory coding in subcortical regions of their brains.

The researchers first tested 42 normal-hearing adults, aged 18 to 55, on their ability to report digits (1, 2, 3, etc.) spoken by a recorded male voice. While subjects listened to this central sound stream, two competing streams of simultaneous digits, spoken by the same male voice, played from the left and right. The subjects’ ability to understand the central stream differed considerably, and decreased when echoes and reverberation were added to the recordings. Selected listeners next watched a silent movie while being presented with repeated beeps that they were instructed to ignore. Meanwhile, the researchers used electrodes attached to the subjects’ scalps to record electrical activity from early, subcortical portions of the auditory pathway in response to the easily audible beeps.

The result: Those listeners who were best at understanding speech in the earlier test also produced the strongest scalp voltage responses. In fact, their peripheral auditory systems responded most strongly to sound even when they were not paying attention to it.
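For readers curious what linking the two measurements looks like in practice, the brief sketch below shows one generic way to test such a relationship: correlating each subject’s accuracy on the digit-report task with the strength of the scalp-recorded subcortical response. The numbers and variable names are purely illustrative assumptions, not the study’s data or the authors’ analysis pipeline.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-subject values (illustrative only, not from the study):
# proportion of central digits reported correctly, and the amplitude of the
# scalp-recorded subcortical (brainstem) response to the beeps, in microvolts.
digit_task_accuracy = np.array([0.92, 0.71, 0.85, 0.64, 0.78, 0.88, 0.59, 0.81])
brainstem_response_uv = np.array([0.42, 0.25, 0.37, 0.21, 0.30, 0.39, 0.18, 0.33])

# A positive Pearson correlation would be consistent with the reported finding:
# listeners with stronger early, subcortical responses did better at the
# selective-listening task.
r, p = pearsonr(digit_task_accuracy, brainstem_response_uv)
print(f"r = {r:.2f}, p = {p:.3f}")
```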

With ongoing funding from the National Institutes of Health and Department of Defense, the researchers next plan to explore the potential consequences of subcortical deficiencies on the cortex’s ability to process auditory information.

“With further tests, we hope to tease apart how peripheral (subcortical) and central (cortical) deficits contribute to communication impairments, ultimately leading to new approaches to combat the social isolation that often ensues,” said Shinn-Cunningham.