Computational Tutorial: Normalization models of attention
Description
By Rachel Denison, Boston University, Psychological & Brain Sciences
Attention is a cognitive process that allows us to prioritize the sensory information most relevant to our behavioral goals. In a successful class of computational models, attention biases neural responses through its interaction with normalization—a canonical neural computation that promotes efficient representations across sensory and cognitive systems. Normalization models of attention have provided quantitative explanations for a wide range of findings on how attention affects neural activity and perception, making them a powerful example of a computational framework for top-down modulation. In this tutorial, we will learn about normalization models of attention—including some of our recent work on dynamic attention—with hands-on MATLAB exercises.
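To give a flavor of the core computation, here is a minimal sketch of a normalization model of attention in the spirit of Reynolds and Heeger (2009): an attention field multiplies the stimulus drive, and the result is divided by a pooled suppressive drive plus a semisaturation constant. This is an illustrative simplification in Python (the tutorial's exercises are in MATLAB); the function name, the 1-D setup, and the use of a simple mean for the suppressive pool are assumptions, not the tutorial's actual code.

```python
import numpy as np

def normalization_model_of_attention(stimulus_drive, attention_field, sigma=1.0):
    """Illustrative sketch of attentional modulation via normalization.

    stimulus_drive : excitatory drive E to each neuron (1-D array)
    attention_field : attentional gain A at each neuron (1-D array)
    sigma : semisaturation constant controlling contrast gain
    """
    # Attention multiplicatively scales the stimulus drive: E' = A * E
    excitatory = attention_field * stimulus_drive
    # Suppressive drive: pool the attention-modulated drive over the
    # population (a simple broad pool here; the full model convolves
    # with a spatially tuned suppressive field).
    suppressive = np.mean(excitatory)
    # Divisive normalization: R = E' / (S + sigma)
    return excitatory / (suppressive + sigma)

# Two stimuli of equal drive; attend to the first one.
E = np.array([1.0, 1.0])
r_neutral = normalization_model_of_attention(E, np.array([1.0, 1.0]), sigma=0.1)
r_attend = normalization_model_of_attention(E, np.array([2.0, 1.0]), sigma=0.1)
```

In this toy example, attending to one stimulus raises its response while the shared normalization pool suppresses the response to the unattended stimulus, capturing the qualitative signature these models explain.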