Eyes and ears

Two senses better than one for learning

September 25, 2006
Aaron Seitz, CAS research assistant professor of psychology with equipment used in his research on multisensory learning. Photo by Frank Curran

Let’s say you’re a budding entomologist preparing for an exam requiring the visual identification of different cricket species. Would it help you learn if you heard a recording of the different cricket chirps while you studied the corresponding cricket images, even though sound will play no part in the test?  

Most neuroscientists would say no, according to Aaron Seitz, a College of Arts and Sciences research assistant professor of psychology. They would argue that learning to recognize subtle visual differences involves areas of the brain specific to processing visual input. But based on recent research, Seitz believes the conventional wisdom is wrong: the brain processes sensory information in a far more integrated fashion. His findings could influence everything from teaching methods to the experimental paradigms of neuroscience.

“Everybody intuitively knows that our physical environment is multisensory,” says Seitz. The smell, look, and taste of food, for example, all contribute to the experience of a meal. Likewise, people often use sound, sight, and touch to learn how to navigate around a new environment. But in the world of neuroscience, he says, “people try to simplify as much as possible. As a result, there are a huge number of studies investigating the processing of a single sense, in isolation.”

Seitz wanted to put the validity of that isolation to the test. So in research published in the July issue of Current Biology, he and Ladan Shams, an assistant professor of psychology at UCLA, trained people to do a “low-level visual task” often used in neuroscience experiments — detecting motion in a particular direction. Specifically, for an hour a day over several days, the trainees watched dots move around a computer screen, a varying portion of them going in a single direction and the rest scattering randomly. As training progressed, the subjects got better at detecting the direction of motion amid the randomness, an example of a process known as perceptual learning.
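For readers who want a concrete picture of that stimulus, here is a minimal sketch, not code from the study, of how a random-dot motion display of this kind can be generated. The dot count, coherence level, speed, and field size are illustrative assumptions; only a fraction of the dots (the "coherence") step in the signal direction on each frame, while the rest move randomly.

```python
import numpy as np

def random_dot_frames(n_dots=100, n_frames=60, coherence=0.3,
                      direction_deg=0.0, speed=2.0, field=200.0, seed=0):
    """Generate dot positions for a random-dot motion stimulus.

    A fraction `coherence` of dots steps in `direction_deg` on every frame;
    the remaining dots step in freshly drawn random directions.
    Dots wrap around a square field of side `field`.
    (Illustrative sketch; parameters are not taken from the study.)
    """
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0, field, size=(n_dots, 2))
    signal = rng.random(n_dots) < coherence            # which dots carry the signal
    theta_signal = np.deg2rad(direction_deg)
    frames = [pos.copy()]
    for _ in range(n_frames - 1):
        theta = rng.uniform(0, 2 * np.pi, n_dots)      # random direction for noise dots
        theta[signal] = theta_signal                   # signal dots share one direction
        step = speed * np.column_stack([np.cos(theta), np.sin(theta)])
        pos = (pos + step) % field                     # wrap around the field edges
        frames.append(pos.copy())
    return np.stack(frames)                            # shape: (n_frames, n_dots, 2)

frames = random_dot_frames(coherence=0.3)
print(frames.shape)  # (60, 100, 2)
```

Lowering `coherence` makes the direction harder to see, which is exactly the difficulty that perceptual learning reduces over days of training.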

In addition to watching the dots, half the subjects simultaneously listened to a stream of white noise that shifted in volume from left speaker to right speaker, or vice versa, to suggest motion in a given direction (much the way the noise of a passing train can indicate its direction of travel). The direction of the white noise was masked with randomness just as the dots were, but the underlying direction of dots and noise was always the same. After the training, both groups were tested on how well they could recognize the direction of the moving dots (without sound), and Seitz found that those who’d had both auditory and visual training learned the visual task faster and better.
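The auditory stream can be sketched in the same spirit: stereo white noise whose left-right balance drifts in one underlying direction while random jitter masks it. The equal-power panning rule, noise level, and jitter amount below are illustrative assumptions, not details from the study.

```python
import numpy as np

def panned_noise(duration_s=1.0, sr=44100, direction=+1, pan_jitter=0.5, seed=0):
    """Stereo white noise whose left/right balance drifts in `direction`
    (+1 = left-to-right, -1 = right-to-left), with random jitter masking
    the underlying direction, analogous to the masked dot motion.
    (Illustrative sketch; parameters are not taken from the study.)
    """
    rng = np.random.default_rng(seed)
    n = int(duration_s * sr)
    noise = 0.1 * rng.standard_normal(n)               # mono white noise
    drift = direction * np.linspace(-1, 1, n)          # underlying pan trajectory
    jitter = pan_jitter * rng.standard_normal(n)       # randomness masking the drift
    pan = np.clip(drift + jitter, -1, 1)               # -1 = full left, +1 = full right
    left = noise * np.sqrt((1 - pan) / 2)              # equal-power panning
    right = noise * np.sqrt((1 + pan) / 2)
    return np.column_stack([left, right])              # shape: (n_samples, 2)

stereo = panned_noise(direction=+1)
print(stereo.shape)  # (44100, 2)
```

The key point of the design is that the sound and the dots always share the same underlying direction during training, even though the test itself is purely visual.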

These results “suggest a need to look at richer environments when investigating brain processing,” says Seitz, such as showing study subjects videos of real-world events rather than just basic shapes, colors, and motion patterns. In addition, he says, the growing understanding of how senses are integrated in perception and learning could lead to more multisensory training for real-world tasks and new rehabilitation exercises to help patients recover sensory function after brain injury.

But first, Seitz and other researchers will need to pin down when multisensory inputs improve learning and when they don't. “Complicated isn’t always better,” he notes. Another open question is whether the effect extends to perceptual learning beyond motion detection, or to higher-level, more cognitive tasks.

Seitz sums it up this way: “These first results are definitely more of a demonstration than a conclusion.”
