A Hearing Aid That Listens to the Brain
Trying to hold a conversation in a packed restaurant can be a challenge for anyone—it’s hard to discern a companion’s words through the clattering of plates and the din of other voices. But the problem is even tougher for someone with a hearing aid. Although some devices can cancel out noise, they can’t always tell which of the sounds in a noisy environment the wearer wants to hear. The result is that current aids amplify distractions and desired sounds alike.
A project led by Barbara Shinn-Cunningham, professor of biomedical engineering and cognitive neural systems and co-director of BU’s Center for Computational Neuroscience & Neural Technology, is focused on making hearing aids smarter by applying knowledge about how the brain attends to different sounds. Hearing loss affects tens of millions of Americans, says Shinn-Cunningham, and along with tinnitus (ringing in the ears), it is one of the two most common health complaints of soldiers. The U.S. Department of Defense has awarded her a five-year, $3 million grant to work toward developing these more sophisticated hearing aids.
Sound waves enter the inner ear, says Shinn-Cunningham, and the cochlea transforms the incoming sounds into a multidimensional display of constituent frequencies by means of specialized hair cells that respond to different frequencies. These signals are then transmitted to the brain, where they pass through multiple regions for processing. At every step along this path, there are opportunities for the brain to interpret the sound information coming in, and it’s this interpretive process that Shinn-Cunningham studies.
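The cochlea’s frequency-by-frequency decomposition has a rough software analogy in Fourier analysis. The sketch below (an illustrative simplification, not a model of the ear’s actual mechanics; the tone frequencies and sample rate are arbitrary choices) mixes two pure tones and then recovers their frequencies from the mixture:

```python
import numpy as np

fs = 8000                      # sample rate in Hz (illustrative choice)
t = np.arange(0, 1.0, 1 / fs)  # one second of samples

# Mix two pure tones: a 440 Hz "voice" and a quieter 1000 Hz "clatter"
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)

# Fourier analysis separates the mixture into its constituent frequencies,
# loosely analogous to hair cells responding to different frequency bands
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The two strongest components recover the original tone frequencies
peaks = sorted(freqs[np.argsort(spectrum)[-2:]])
print(peaks)  # → [440.0, 1000.0]
```

The brain, of course, does far more than this static decomposition; the point of the analogy is only the first step, splitting one waveform into a display of frequencies that later stages can interpret.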
How do we know, for instance, that a set of sounds comes from a single object? How do we choose to focus on one stream of sound over another? How do distracting noises and hearing loss interfere with our ability to process sentences? Ongoing work in her lab has focused on understanding how people listen to complex information such as overlapping sounds. Her lab initially focused on behavioral studies identifying what kinds of information people extract from noisy situations and how they attend to different sounds; Shinn-Cunningham is now also incorporating brain imaging, to track and understand the specific patterns of brain activity that allow us to make sense of sounds.
Her team uses both electroencephalography (EEG) and magnetoencephalography (MEG) to see synchronous neural activity and to localize the activity to different parts of the brain. Their first task, says Shinn-Cunningham, is to “understand what the signature activity is over time” when the brain is attending to sounds. “We hope we can control how a hearing aid operates based on what a listener wants to attend, using brain activity,” she explains, to create a device that filters out whatever sound is unimportant at a given moment.
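In the most schematic terms, such a device would use a decoded attention signal to decide which sound stream to boost and which to suppress. The toy sketch below is a hypothetical simplification of that idea (the function name, gain values, and the assumption that streams are already separated are all inventions for illustration; decoding attention from EEG/MEG is the hard, unsolved part):

```python
import numpy as np

def apply_attention(streams, attended, gain_up=2.0, gain_down=0.25):
    """Boost the attended sound stream and suppress the rest, then mix.

    streams  -- list of separated audio signals (NumPy arrays)
    attended -- index of the stream the decoded brain signal says to keep
    """
    return sum(
        s * (gain_up if i == attended else gain_down)
        for i, s in enumerate(streams)
    )

voice = np.ones(4)        # stand-in for a companion's speech
clatter = 2 * np.ones(4)  # stand-in for louder background noise

# If attention decoding picks stream 0, the voice is doubled and the
# clatter quartered: 2.0 * 1 + 0.25 * 2 = 2.5 per sample
out = apply_attention([voice, clatter], attended=0)
print(out)  # → [2.5 2.5 2.5 2.5]
```

The research challenge Shinn-Cunningham describes sits upstream of this mixing step: reading, from EEG or MEG activity, which stream the listener actually wants `attended` to point at, moment by moment.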