Theoretical and empirical results in the neural networks literature demonstrate that effective learning at real-world scale requires changes to synaptic weights that approximate the gradient of a global loss function. For neuroscientists, this means that the brain must have mechanisms for communicating loss gradients between regions, either explicitly or implicitly. Here, I describe our research into potential means of communicating loss gradients using the unique properties of apical dendrites in pyramidal neurons. I will present modelling work showing that, in principle, ensembles of pyramidal neurons could use the temporal derivative of their activity to estimate loss gradients. I will also show how this estimation can be learned using the discontinuities that spikes induce. Finally, I will discuss specific experimental predictions that arise from these theories.
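The gradient-from-temporal-derivative idea can be illustrated with a toy sketch. This is an illustrative assumption, not the talk's actual model: if feedback dynamics nudge an ensemble's activity toward a target, then the rate of change of that activity is proportional to the negative loss gradient, which the ensemble could read off directly.

```python
import numpy as np

# Toy sketch (assumed setup, not the speaker's model): feedback drives
# ensemble activity y toward a target y_star, descending the squared-error
# loss L = 0.5 * ||y - y_star||^2. The temporal change in y then encodes
# the negative gradient of L, so dividing by the step size recovers it.
rng = np.random.default_rng(0)
y_star = rng.normal(size=5)        # target activity (illustrative)
y = rng.normal(size=5)             # current ensemble activity
eta, dt = 0.1, 1.0                 # feedback gain and time step (assumptions)

true_grad = y - y_star             # dL/dy for the squared-error loss
y_next = y - eta * true_grad * dt  # feedback nudges activity toward target

# Estimate the gradient purely from the temporal derivative of activity
est_grad = -(y_next - y) / (eta * dt)

print(np.allclose(est_grad, true_grad))  # the estimate matches the gradient
```

In this linear toy case the match is exact; the interesting modelling question, as the abstract notes, is how real pyramidal ensembles could learn such an estimate from spiking activity.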
This event is co-organized by the CBMM Trainee Leadership Council.