HRC Seminar: Josh McDermott

Starts:
10:30 am on Friday, April 14, 2017
Ends:
1:00 pm on Friday, April 14, 2017
Location:
44 Cummington Mall, Room 203, Boston MA
Title:
Computational Neuroimaging of Human Auditory Cortex

Abstract:
Just by listening, humans can determine who is talking to them, whether a window in their house is open or shut, or what their child dropped on the floor in the next room. This ability to derive information from sound is enabled by a cascade of neuronal processing stages that transform the sound waveform entering the ear into cortical representations that are presumed to make behaviorally important sound properties explicit. Although much is known about the peripheral processing of sound, the auditory cortex remains poorly understood, with little consensus even about its coarse-scale organization in humans. This talk will describe our recent efforts using computational neuroimaging methods to better understand the cortical representation of sound. I will describe the use of "model-matched" stimuli to test whether existing models explain cortical neural responses, and the development of new models of cortical computation via task-optimization of deep neural networks. We have harnessed these methods and models to reveal representational transformations occurring between primary and non-primary cortex that may support the recognition of speech, music, and other real-world sound signals.