Hierarchical computation in auditory cortex
Description
Just by listening, humans can infer a host of useful information about events in the world. Much is known about peripheral auditory processing, but auditory cortex remains poorly understood, particularly in computational terms. Here I will talk about my recent work exploring computational properties of cortical responses, revealing a hierarchy in human auditory cortex.
I will first describe our work studying the neural basis of a central challenge in everyday listening: hearing sources of interest embedded in real-world background noise (e.g., a bustling coffee shop, crickets chirping, heavy rain hitting pavement). Robustness to background noise varied systematically across auditory cortex, revealing a hierarchy potentially related to the “core-belt-parabelt” distinction proposed from macaque anatomical studies.
Next, I will discuss how we developed an improved model of cortical responses by optimizing a hierarchical convolutional neural network (CNN) to perform a difficult real-world auditory task. Despite never being trained to fit neural data, the CNN’s hidden layers predicted auditory cortical responses substantially better than a standard cortical model, with distinct CNN layers best predicting responses in distinct parts of auditory cortex.
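The talk does not specify the analysis pipeline, but a common way to compare a task-optimized network's hidden layers to neural data is regularized linear regression from layer activations to measured responses, evaluated on held-out stimuli. Below is a minimal sketch of that general approach; all names, dimensions, and the choice of ridge regression are illustrative assumptions, not the authors' actual method:

```python
import numpy as np

def ridge_predict(train_activations, train_responses, test_activations, alpha=1.0):
    """Fit a ridge regression from CNN-layer activations to neural responses
    on training sounds, then predict responses to held-out test sounds.
    Shapes are (n_sounds, n_features) for activations and
    (n_sounds, n_voxels) for responses. (Hypothetical helper.)"""
    X, Y = train_activations, train_responses
    # Closed-form ridge solution: W = (X^T X + alpha * I)^-1 X^T Y
    n_features = X.shape[1]
    W = np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ Y)
    return test_activations @ W

# Toy data: 80 training sounds, 20 test sounds, 50 features, 10 voxels.
rng = np.random.default_rng(0)
W_true = rng.normal(size=(50, 10))
X_train = rng.normal(size=(80, 50))
X_test = rng.normal(size=(20, 50))
Y_train = X_train @ W_true + 0.1 * rng.normal(size=(80, 10))

pred = ridge_predict(X_train, Y_train, X_test)
```

Repeating this fit for each hidden layer and each cortical region, and asking which layer predicts which region best, is one way a layer-to-region correspondence like the one described above can be quantified.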
Together, this work suggests a multi-stage hierarchy of auditory cortical computation, and it begins to characterize the properties of those computations.