Colloquium on the Brain and Cognition with SueYeon Chung (Teuber Lecture) and Teaching Awards
Description
Before the talk, teaching awards will be presented to this year's recipients.
Talk Title: Computing with Neural Manifolds: A Multi-Scale Framework for Understanding Biological and Artificial Neural Networks
Abstract: Recent breakthroughs in experimental neuroscience and machine learning have opened new frontiers in understanding the computational principles governing neural circuits and artificial neural networks (ANNs). Both biological and artificial systems exhibit an astonishing degree of orchestrated information-processing capability across multiple scales, from the microscopic responses of individual neurons to the emergent macroscopic phenomena of cognition and task function. At the mesoscopic scale, the structure of neural population activity manifests itself as neural representations. Neural computation can be viewed as a series of transformations of these representations through successive processing stages of the brain. The primary focus of my lab's research is to develop theories of neural representations that describe the principles of neural coding and, importantly, capture the complex structure of real data from both biological and artificial systems.
In this talk, I will present three related approaches that leverage techniques from statistical physics, machine learning, and geometry to study the multi-scale nature of neural computation. First, I will introduce new theories based on statistical physics and convex geometry that connect the complex geometric structures arising from neural responses (i.e., neural manifolds) to the efficiency of neural representations in implementing a task. Second, I will employ these theories to analyze how these representations evolve across scales, shaped by the properties of single neurons, learning dynamics, and the transformations across distinct brain regions. Finally, I will show how these insights extend efficient coding principles beyond early sensory stages, linking representational geometry to efficient task implementation. This framework not only helps interpret and compare models of brain data but also offers a principled approach to designing ANN models for higher-level vision. This perspective opens new opportunities for using neuroscience-inspired principles to guide the development of intelligent systems.
Bio: SueYeon Chung is an Assistant Professor of Physics and Applied Mathematics at Harvard University, an Institute Investigator at the Kempner Institute for the Study of Natural and Artificial Intelligence, and a member of the Center for Brain Science. She also holds a part-time appointment at the Flatiron Institute's Center for Computational Neuroscience. Her research sits at the intersection of theoretical neuroscience, statistical physics, and machine learning, with a focus on understanding computation in the brain and artificial neural networks through the geometry of neural representations. Prior to joining Harvard, she was faculty at NYU's Center for Neural Science, a Postdoctoral Fellow at Columbia's Center for Theoretical Neuroscience, and a BCS Fellow in Computation at MIT. She holds a Ph.D. in Applied Physics from Harvard and a B.A. from Cornell, and is a recipient of the Klingenstein-Simons Fellowship (2023) and Sloan Research Fellowship (2024).
Followed by a reception with food and drink in the 3rd-floor atrium.