
Perception in a dynamic world
Description
**Faculty Candidate Search - Cognitive Neuroscience**
Perceptual systems are shaped by the world they need to operate in. The sensory world is fundamentally dynamic: the statistical properties of sensory features vary from one environment to the next, and varying task demands change which sensory features are relevant from moment to moment. I will present two projects that explore how perceptual systems are shaped by this dynamic nature of the sensory world. First, using human speech perception as a "model organism," I will present a model of how perceptual systems can cope with structured variability in sensory statistics. Speech perception is interesting because, on the one hand, the statistical properties of the speech signal are highly variable due to differences in talker gender, age, dialect, etc. On the other hand, this variability is not completely arbitrary: individual talkers have relatively stable ways of producing language, so past experience is informative about the future. To cope with this structured variability, I argue that the speech perception system needs to learn the statistical properties of unfamiliar talkers' speech, while remembering the statistics of familiar talkers and generalizing to similar talkers. Like speech perception itself, each of these processes can be modeled as a form of statistical inference, providing a unified framework for understanding the remarkable robustness of human speech perception, as well as how perceptual systems in general can cope with structured variability in the sensory world.
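The idea of talker adaptation as statistical inference can be sketched in miniature. Below, a listener's belief about one talker's mean cue value (e.g., a voice-onset-time boundary) starts from a prior pooled over familiar talkers and is updated with conjugate normal-normal Bayesian updates as that talker's speech is observed. All numeric values and the specific cue are illustrative assumptions, not details from the talk.

```python
# A minimal sketch: Bayesian updating of a talker-specific cue mean,
# assuming a normal likelihood with known observation variance.
# Cue values (in ms) and variances below are hypothetical.

def update_talker_belief(prior_mean, prior_var, observations, obs_var):
    """Posterior over a talker's mean cue value after observing their speech."""
    post_mean, post_var = prior_mean, prior_var
    for x in observations:
        # Standard conjugate normal-normal update (Kalman-style gain).
        k = post_var / (post_var + obs_var)
        post_mean = post_mean + k * (x - post_mean)
        post_var = (1 - k) * post_var
    return post_mean, post_var

# A new talker's prior pools over familiar talkers, so the listener starts
# near the population mean but adapts toward this talker with exposure.
familiar_talker_means = [20.0, 35.0, 28.0]   # hypothetical cue values (ms)
prior_mean = sum(familiar_talker_means) / len(familiar_talker_means)
prior_var = 100.0                            # uncertainty across talkers
mean, var = update_talker_belief(prior_mean, prior_var,
                                 observations=[45.0, 48.0, 44.0], obs_var=25.0)
```

With each observation the posterior mean moves toward the unfamiliar talker's statistics and the posterior variance shrinks, which is the sense in which past experience with a talker makes their future speech more predictable.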
Second, the sensory world is dynamic not only in the sense that objective statistics change from one situation to the next, but also because each task requires its own particular kind of perceptual information. Given that the perceptual system has limited neural resources, learning a new task should lead to re-allocation of those resources to capture task-relevant information more efficiently. To test this, we taught people non-linearly separable categories of novel animal-like objects and compared neural representations measured with fMRI before and after training. Representational similarity analysis showed that, as a result of training, neural representations became more aligned with task-relevant features, both in mid-level visual areas and in areas associated with the fronto-parietal attention network. Together, these findings highlight the critical role that learning, memory, and attention play in perception in a dynamic world.
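The logic of the representational similarity analysis can be illustrated with a small synthetic example: build a model representational dissimilarity matrix (RDM) from task-relevant feature distances, build neural RDMs from response patterns, and compare them. The "voxel patterns" below are simulated stand-ins, not real fMRI data; greater model-neural alignment after training is the predicted pattern, not a reproduction of the study's result.

```python
# Hedged RSA sketch: correlate a neural RDM with a model RDM built from
# task-relevant features, using synthetic patterns in place of fMRI data.
import numpy as np

def rdm(patterns):
    """Pairwise dissimilarity (1 - Pearson r) between condition patterns (rows)."""
    return 1.0 - np.corrcoef(patterns)

def upper(m):
    """Off-diagonal upper-triangle entries, the usual RSA comparison vector."""
    i, j = np.triu_indices(m.shape[0], k=1)
    return m[i, j]

def rank(v):
    order = np.argsort(v)
    r = np.empty(len(v))
    r[order] = np.arange(len(v))
    return r

def spearman(a, b):
    """Spearman correlation via Pearson on ranks (ties ignored for brevity)."""
    return np.corrcoef(rank(a), rank(b))[0, 1]

rng = np.random.default_rng(0)
features = rng.normal(size=(8, 4))      # task-relevant feature values per object
model_rdm = rdm(features)

# "Pre-training" patterns are unstructured noise; "post-training" patterns
# encode the task features (plus noise), so they should align with the model.
pre = rng.normal(size=(8, 50))
post = features @ rng.normal(size=(4, 50)) + 0.1 * rng.normal(size=(8, 50))
r_pre = spearman(upper(rdm(pre)), upper(model_rdm))
r_post = spearman(upper(rdm(post)), upper(model_rdm))
```

Comparing RDMs rather than raw patterns is what lets the analysis ask whether a region's representational geometry, not its mean activity, has reorganized around the trained category structure.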