The language of probability allows us to coherently and automatically account for uncertainty. This course will teach you how to build, fit, and do inference in probabilistic models. These models let us generate novel images and text, find meaningful latent representations of data, take advantage of large unlabeled datasets, and even perform analogical reasoning automatically. This course will teach the basic building blocks of these models and the computational tools needed to use them.
Assignment 1: 15% (Feb 8)
Assignment 2: 15% (Mar 15)
Assignment 3: 20% (Apr 5)
Midterm: 20% (Feb 14)
Final: 30%
Lecture: Introduction (Jan 8)
Tutorial: None
Reading:
Murphy: Chapters 1 and 2
Lecture: Basic Classifiers (Jan 16)
Tutorial: Basic Supervised Learning and Probability (Jan 17)
Reading:
Murphy: Chapters 3, 4, 7-9 (excluding * sections)
Assignment 1 Due Feb 8 at 11:59pm
LaTeX Template for Solutions and LaTeX Style File
Lecture: Directed Graphical Models (Jan 22)
Tutorial: Stochastic Optimization (Jan 24)
Reading:
Murphy: Chapters 10-12 (excluding * sections)
Lecture: Undirected Graphical Models (Jan 29)
Tutorial: Automatic Differentiation (Jan 31)
Reading:
Lecture: Exact Inference (Feb 5)
Tutorial: Markov Random Fields (Feb 7)
Reading:
Murphy: Chapter 20
MacKay: Chapter 21.1 (worked example with numbers)
MacKay: Chapter 16 (Message Passing, including soldier intuition)
MacKay: Chapter 26 Exact Inference on Factor Graphs
Assignment 1 Due (Feb 8)
Sample Midterm: Sample Problems for the Midterm
Lecture: Variational Inference (Feb 12)
Lecture Slides Slimmed: Variational Inference (Thanks Trevor Ablett)
Tutorial: Midterm (Feb 14)
Reading:
Reading Week: No Lecture or Tutorial
Assignment 2 Due March 15 at 11:59pm
Lecture: Sampling and Monte Carlo Methods (Feb 26)
Tutorial: More on Exact Inference (Feb 28)
Reading:
MacKay: Chapter 29
Lecture: Sequential Data and Time-Series Models (Mar 5)
Tutorial: Gradient-based MCMC (Mar 7)
Lecture Readings:
Lecture: Stochastic Variational Inference (Mar 12)
Tutorial: Gradient-based Optimization for Discrete Distributions (Mar 14 - Slides based on Chris Maddison's Fields talk)
Reading:
Tutorial Readings:
Stochastic Computation Graphs
Gradient Estimators
Williams, Ronald J. "Simple statistical gradient-following algorithms for connectionist reinforcement learning." Reinforcement Learning. Springer, Boston, MA, 1992. 5-32.
Kingma, Diederik P., et al. "Semi-supervised learning with deep generative models." Advances in Neural Information Processing Systems. 2014.
Maddison, Chris J., Andriy Mnih, and Yee Whye Teh. "The concrete distribution: A continuous relaxation of discrete random variables." arXiv preprint arXiv:1611.00712 (2016).
Jang, Eric, Shixiang Gu, and Ben Poole. "Categorical reparameterization with gumbel-softmax." arXiv preprint arXiv:1611.01144 (2016).
Tucker, George, et al. "REBAR: Low-variance, unbiased gradient estimates for discrete latent variable models." Advances in Neural Information Processing Systems. 2017.
Grathwohl, Will, et al. "Backpropagation through the Void: Optimizing control variates for black-box gradient estimation." arXiv preprint arXiv:1711.00123 (2017).
Log-derivative and reparameterization tricks
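The two tricks named above can be illustrated with a toy objective. Below is a minimal sketch (not course material, just an illustration): estimating the gradient of E_q[z²] with respect to the mean of q(z) = N(mu, 1), whose true value is 2·mu, using both the log-derivative (score-function / REINFORCE) estimator and the reparameterization estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 1.5          # parameter of q(z) = N(mu, 1)
n = 200_000       # number of Monte Carlo samples
eps = rng.standard_normal(n)
z = mu + eps      # samples from q via the reparameterization z = mu + eps

# Objective: E_q[f(z)] with f(z) = z^2; the true gradient w.r.t. mu is 2*mu.
f = z ** 2

# Log-derivative trick (score-function / REINFORCE):
#   grad = E_q[f(z) * d/dmu log q(z)] = E_q[f(z) * (z - mu)]
score_grad = np.mean(f * (z - mu))

# Reparameterization trick: differentiate through z = mu + eps, so
#   grad = E_eps[d/dmu f(mu + eps)] = E_eps[2 * (mu + eps)]
reparam_grad = np.mean(2 * z)

print(score_grad, reparam_grad, 2 * mu)  # both estimates converge to 2*mu
```

Both estimators are unbiased, but the reparameterization estimate typically has much lower variance, which is the motivation for the relaxation-based methods (Concrete/Gumbel-Softmax, REBAR, RELAX) in the readings above.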
Lecture: Variational Autoencoders (Mar 19)
Tutorial: Practicalities of SVI (Mar 21)
Reading:
Fun Extensions
Assignment 3 Due April 5 at 11:59pm
Lecture: Generative Adversarial Networks (Mar 26)
Tutorial: Expectation Maximization (Mar 28)
Lecture Readings:
Lecture: Flow-based Models (Apr 2)
Tutorial: Bayesian Optimization (Apr 4)
Assignment 3 Due (Apr 5)
Lecture Readings:
Exam Study Topics: Topics to Focus for Final Exam Study