Computation Tutorial: Bayesian Inference in Generative Models
Description
Bayesian inference is ubiquitous in models and tools across cognitive science and neuroscience. While the mathematical formulation of Bayesian models in terms of a prior and a likelihood is simple, exact Bayesian inference is intractable for most models of interest. In this tutorial, we will cover a range of approximate inference methods, including sampling-based methods (e.g. MCMC, particle filters) and variational inference, and describe how neural networks can be used to speed up these methods. We will also introduce probabilistic programming languages, which provide tools for black-box Bayesian inference in complex models. Hands-on exercises include implementing inference algorithms for simple models and/or building complex models in a probabilistic programming language.
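To give a flavor of the sampling-based methods covered in the tutorial, here is a minimal sketch of random-walk Metropolis-Hastings MCMC in plain Python. The model (a Beta(2, 2) prior over a coin's bias with Bernoulli observations), the data, and all function names are illustrative assumptions, not material from the tutorial itself; the conjugate model is chosen so the sampler's estimate can be checked against the exact posterior mean.

```python
import math
import random

random.seed(0)

# Illustrative toy model (assumption, not from the tutorial):
# Bernoulli likelihood with a Beta(2, 2) prior on the coin bias theta.
# The exact posterior is Beta(2 + heads, 2 + tails), so we can sanity-check.
data = [1, 1, 0, 1, 1, 1, 0, 1]  # hypothetical coin flips
heads, tails = sum(data), len(data) - sum(data)

def log_posterior(theta):
    """Unnormalized log posterior: log prior + log likelihood."""
    if not 0.0 < theta < 1.0:
        return -math.inf
    log_prior = math.log(theta) + math.log(1.0 - theta)  # Beta(2, 2), up to a constant
    log_lik = heads * math.log(theta) + tails * math.log(1.0 - theta)
    return log_prior + log_lik

def metropolis(n_samples, step=0.1):
    """Random-walk Metropolis sampler over theta."""
    theta = 0.5
    samples = []
    for _ in range(n_samples):
        proposal = theta + random.gauss(0.0, step)
        # Accept with probability min(1, posterior ratio), in log space.
        if math.log(random.random()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        samples.append(theta)
    return samples

samples = metropolis(20000)
kept = samples[5000:]  # discard burn-in
posterior_mean = sum(kept) / len(kept)
exact_mean = (2 + heads) / (4 + len(data))  # mean of Beta(2 + heads, 2 + tails)
print(f"MCMC estimate: {posterior_mean:.3f}, exact: {exact_mean:.3f}")
```

Only the unnormalized posterior is needed, since the normalizing constant cancels in the acceptance ratio; this is exactly why MCMC remains usable when exact inference is intractable.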
Additional Info
After the tutorial, slides and resources will be posted on the Computational Tutorials Stellar page:
- Slides, references, and exercises: https://stellar.mit.edu/S/project/bcs-comp-tut/materials.html
- Videos: http://cbmm.mit.edu/videos?field_video_grouping_tid[0]=781