
Leo Kozachkov Thesis Defense: Achieving Stable and Brain-Like Dynamics in Recurrent Models via Contraction Analysis
Description
In-person location: Picower Seminar Room, 46-3310
Virtually on Zoom: https://mit.zoom.us/j/99097971829
Abstract: The brain consists of many interconnected networks, each with time-varying and complex dynamics. Despite this complexity, neural activity tends to converge to reproducible sequences of states. How the brain achieves these complex-yet-stable dynamics is unknown. We address this problem using contraction analysis, a set of tools from the dynamical systems and control theory literature. Loosely, a contracting dynamical system is one whose trajectories all converge toward a common trajectory, which may itself be time-varying and complicated. We apply contraction analysis to recurrent neural networks (RNNs), as well as to RNNs of RNNs, to derive constraints on synaptic plasticity rules and weight matrices under which the resulting models are provably contracting. By parametrizing these constraints for optimization in standard deep learning libraries, we show that contraction-constrained networks achieve high performance on sequential processing benchmarks (e.g., sequential CIFAR-10), as well as high similarity to frontal lobe neural activity recorded from a behaving non-human primate. Our work, both theoretical and experimental, suggests that stability-constrained recurrent architectures yield promising models of neural activity as the scope of experimental neuroscience expands to the study of multiple dynamic, interacting brain areas.
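
For readers new to the framework, the defining condition is worth stating precisely. The following is the standard characterization from the contraction analysis literature (Lohmiller and Slotine, 1998), given here as background rather than as a result of the thesis: a system is contracting if its Jacobian is uniformly negative definite in some metric, in which case all trajectories forget their initial conditions exponentially fast.

```latex
% Contraction condition (Lohmiller & Slotine, 1998).
% The system \dot{x} = f(x, t) is contracting with rate \lambda > 0 if there
% exists a uniformly positive definite metric M(x, t) such that
\dot{M} + M \frac{\partial f}{\partial x}
  + \left(\frac{\partial f}{\partial x}\right)^{\!\top} M \preceq -2 \lambda M .
% Any two trajectories then converge to each other exponentially:
\| x_1(t) - x_2(t) \| \le C \, e^{-\lambda t} \, \| x_1(0) - x_2(0) \| .
```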
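To illustrate what "parametrizing these constraints for optimization in standard deep learning libraries" can look like in practice, here is a minimal PyTorch sketch. It relies on the classical sufficient condition that a rate network dx/dt = -x + W tanh(x) + B u, with a 1-Lipschitz nonlinearity, is contracting in the identity metric whenever the spectral norm of W is below 1. The class name, the margin gamma, and the spectral-norm rescaling are illustrative assumptions for this sketch, not the parametrization developed in the thesis.

```python
import torch
import torch.nn as nn


class ContractingRNNCell(nn.Module):
    """Forward-Euler step of the rate network  dx/dt = -x + W tanh(x) + B u.

    Background fact (not specific to the thesis): with a 1-Lipschitz
    nonlinearity such as tanh, this system is contracting in the identity
    metric whenever ||W||_2 < 1, because the symmetric part of the Jacobian
    -I + W diag(tanh'(x)) is then uniformly negative definite. The constraint
    is enforced by rescaling a free matrix so its spectral norm never exceeds
    a margin gamma < 1. All names here are illustrative.
    """

    def __init__(self, n, n_in, gamma=0.99, dt=0.1):
        super().__init__()
        self.W_free = nn.Parameter(torch.randn(n, n) / n ** 0.5)
        self.B = nn.Parameter(torch.randn(n, n_in) / n_in ** 0.5)
        self.gamma, self.dt = gamma, dt

    def W(self):
        # Differentiable projection onto {W : ||W||_2 <= gamma}: a free matrix
        # that already satisfies the bound is left unchanged; otherwise it is
        # shrunk radially to the constraint boundary.
        sigma = torch.linalg.matrix_norm(self.W_free, ord=2)
        return self.W_free * (self.gamma / torch.clamp(sigma, min=self.gamma))

    def forward(self, x, u):
        dx = -x + torch.tanh(x) @ self.W().T + u @ self.B.T
        # Euler step; for dt in (0, 1] the discrete map has Jacobian norm
        # at most (1 - dt) + dt * gamma < 1, so it is also a contraction.
        return x + self.dt * dx


# Usage: run a batch of 8 trajectories for 100 steps. Any two trajectories
# driven by the same input sequence converge toward each other.
cell = ContractingRNNCell(n=64, n_in=16)
x = torch.randn(8, 64)
for u_t in torch.randn(100, 8, 16).unbind(0):
    x = cell(x, u_t)
```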