The language of probability allows us to coherently and automatically account for uncertainty. This course will teach you how to build, fit, and do inference in probabilistic models. These models let us generate novel images and text, find meaningful latent representations of data, take advantage of large unlabeled datasets, and even do analogical reasoning automatically. The course covers the basic building blocks of these models and the computational tools needed to use them, and will have a major project component.

- Lectures: Tuesdays 3:00-5:00pm in MP 102
- Tutorials: Thursdays 1:00-2:00pm in LM 162
- Piazza discussion board
- Instructor: David Duvenaud
- Email: duvenaud@cs.toronto.edu Put "CSC412" in the subject - but are you sure you don't want to ask it on Piazza?
- Office hours: Tuesdays 1:00-2:00pm, Room 384 Pratt
- Teaching assistants: Geoffrey Roeder, Jake Snell, Assimakis Kattis, David Madras

**January 10: Introduction**

**January 12: Tutorial: Basic supervised learning and probability**

- Reading: Chapter 2 of David MacKay's textbook

**January 17: Basic Probabilistic Generative and Discriminative models**

- Reading: Chapter 3 of David MacKay's textbook

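
As a rough illustration of a generative classifier from this lecture, here is a minimal Bernoulli naive Bayes model fit by maximum likelihood with Laplace smoothing. The toy dataset and the smoothing constant are made up for illustration; this is a sketch, not the course's own example code.

```python
import numpy as np

# Hypothetical toy data: rows are binary feature vectors, y is the class label.
X = np.array([[1, 1, 0],
              [1, 0, 0],
              [0, 1, 1],
              [0, 0, 1]])
y = np.array([0, 0, 1, 1])

def fit_naive_bayes(X, y, alpha=1.0):
    """Maximum-likelihood fit with Laplace smoothing alpha."""
    classes = np.unique(y)
    priors = np.array([np.mean(y == c) for c in classes])
    # theta[c, d] = P(x_d = 1 | class c)
    theta = np.array([(X[y == c].sum(0) + alpha) / ((y == c).sum() + 2 * alpha)
                      for c in classes])
    return priors, theta

def predict(x, priors, theta):
    # log P(c | x) is proportional to log prior + sum_d log P(x_d | c)
    log_lik = (np.log(theta) * x + np.log(1 - theta) * (1 - x)).sum(1)
    return np.argmax(np.log(priors) + log_lik)

priors, theta = fit_naive_bayes(X, y)
print(predict(np.array([1, 1, 0]), priors, theta))  # resembles class-0 rows
```

A discriminative model such as logistic regression would instead model P(y | x) directly, without a model of how x is generated.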

**January 19: Tutorial: Stochastic optimization**
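
A minimal sketch of stochastic gradient descent on a least-squares problem. The synthetic data, step size, and epoch count are illustrative choices, not taken from the tutorial.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic regression problem: recover w_true from noisy data.
w_true = np.array([2.0, -3.0])
X = rng.normal(size=(200, 2))
y = X @ w_true + 0.1 * rng.normal(size=200)

w = np.zeros(2)
lr = 0.1
for epoch in range(50):
    for i in rng.permutation(len(X)):        # shuffle each pass over the data
        grad = (X[i] @ w - y[i]) * X[i]      # gradient of 0.5 * (x.w - y)^2
        w -= lr * grad
print(w)  # close to w_true
```

Each update uses a single randomly chosen example, so the gradient is a noisy but unbiased estimate of the full-batch gradient.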

**January 24: Directed Graphical Models**

**January 26: Tutorial: Automatic differentiation**

- Autodiff demo slides
- Implementation slides

- Reading: Roger Grosse's slides on backprop
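
For contrast with the reverse-mode backprop covered in the slides, here is a tiny forward-mode autodiff sketch using dual numbers. This is an illustrative toy, not the tutorial's implementation.

```python
# Forward-mode automatic differentiation with dual numbers: each value
# carries its derivative, and operations propagate both by the chain rule.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot        # value and derivative
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.val * o.dot + self.dot * o.val)  # product rule
    __rmul__ = __mul__

def derivative(f, x):
    return f(Dual(x, 1.0)).dot

# d/dx (x^2 + 3x) at x = 2 is 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```

Reverse mode (backprop) instead records the computation and sweeps derivatives backwards, which is cheaper when there are many inputs and one output.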

**January 31: Undirected Graphical Models**

**February 2: Tutorial: Markov Random Fields**

**February 7: Exact Inference**

**February 9: Tutorial: Junction-tree algorithm**

- Notes
- Slides

**February 10: Assignment 1 due, submitted through Markus.**

**February 14: Variational Inference**

- Reading: Tutorial on variational inference

**February 16: Midterm exam**

Things to know for midterm:

- Bayes' rule, sum and product rules of probability, expectations
- Conditioning, normalization, marginalization
- Exponential family distributions, maximum likelihood
- Logistic regression, Naive Bayes
- Converting graphical models to pdfs and back
- Determining conditional independence
- DAGs vs UGMs vs factor graphs
- Computational complexity of inference
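
Several of the items above (Bayes' rule, marginalization, normalization) come together in a small worked calculation. The numbers here (sensitivity, specificity, base rate) are made up for illustration.

```python
# A test with 95% sensitivity, 90% specificity, and a 1% base rate.
p_d = 0.01                      # P(disease)
p_pos_d = 0.95                  # P(positive | disease)
p_pos_nd = 0.10                 # P(positive | no disease)

# Sum rule / marginalization: P(positive)
p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)

# Bayes' rule: P(disease | positive)
posterior = p_pos_d * p_d / p_pos
print(round(posterior, 3))      # under 10%, despite the accurate-looking test
```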

**February 18 to 26: Reading week, no classes**

**February 28: Sampling and Monte Carlo methods**

- Reading: Iain Murray's tutorial on MCMC
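
A minimal random-walk Metropolis sampler targeting a standard normal. The proposal scale and chain length are illustrative choices; this is a sketch, not code from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    return -0.5 * x**2                   # unnormalized log N(0, 1)

x, samples = 0.0, []
for _ in range(20000):
    prop = x + rng.normal(scale=1.0)     # symmetric random-walk proposal
    # Accept with probability min(1, target(prop) / target(x))
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
    samples.append(x)
samples = np.array(samples[2000:])       # drop burn-in
print(samples.mean(), samples.var())     # approximately 0 and 1
```

Because the target only appears as a ratio, the normalizing constant is never needed, which is the point of MCMC.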

**March 1: Tutorial: Gradient-based MCMC**

- Hamiltonian Monte Carlo and Langevin Dynamics
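
A sketch of unadjusted Langevin dynamics sampling a standard normal: follow the gradient of the log-density plus injected Gaussian noise. HMC builds on the same gradient information but adds momentum and a Metropolis correction. The step size here is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(1)
eps = 0.1                                   # step size (hypothetical choice)

def grad_log_p(x):
    return -x                               # gradient of log N(0, 1)

x, samples = 0.0, []
for _ in range(50000):
    # Langevin update: drift up the log-density, plus noise.
    x = x + 0.5 * eps * grad_log_p(x) + np.sqrt(eps) * rng.normal()
    samples.append(x)
samples = np.array(samples[5000:])
print(samples.var())  # near 1, with a small bias from the finite step size
```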

**Assignment 2 due March 12, submitted through Markus**

- Python starter code

Some Python and Numpy resources, from Roger Grosse's neural networks course:

- Anaconda provides an installer for Python and Numpy for Windows, Linux, and Mac.
- Numpy tutorial
- Learn X in Y minutes can get you up to speed in Python if you already know other languages.

**March 7: Sequential data and time-series models**

**March 9: Tutorial: REINFORCE and differentiating through discrete variables**

- Related paper: Gradient Estimation Using Stochastic Computation Graphs
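
A sketch of the REINFORCE / score-function estimator for a discrete variable: the gradient of an expectation over a Bernoulli is estimated as f(b) times the score d/dp log p(b), and compared against the exact answer f(1) - f(0). The function f and the probability p are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3
f = lambda b: (b - 2.0) ** 2          # arbitrary function of the sample

b = (rng.uniform(size=200000) < p).astype(float)   # b ~ Bernoulli(p)
score = b / p - (1 - b) / (1 - p)                  # d/dp log p(b)
estimate = np.mean(f(b) * score)                   # REINFORCE estimate

exact = f(1.0) - f(0.0)               # = 1 - 4 = -3
print(estimate, exact)
```

The estimator is unbiased but can have high variance, which motivates the control-variate and relaxation tricks discussed in the related paper.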

**March 13: Last day to drop course.**

**March 14: Stochastic Variational Inference**

**March 14: Tutorial: Practicalities of SVI**

**March 21: Variational Autoencoders**

- Example code
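
A minimal sketch of the reparameterization trick at the heart of VAEs: a sample z ~ N(mu, sigma^2) is rewritten as z = mu + sigma * eps with eps ~ N(0, 1), so a Monte Carlo objective becomes differentiable in mu and sigma. The numbers below are toy values for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 0.5

eps = rng.normal(size=100000)        # noise with no dependence on parameters
z = mu + sigma * eps                 # deterministic, differentiable in mu, sigma

# Monte Carlo estimate of E[z^2]; the exact value is mu^2 + sigma^2.
print(np.mean(z**2), mu**2 + sigma**2)
```

In a VAE, mu and sigma are outputs of the encoder network, and gradients of the evidence lower bound flow through z into the encoder.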

**March 29: Gaussian processes**
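
A minimal Gaussian process regression sketch with an RBF kernel: condition a Gaussian prior over functions on a few observations and read off the posterior mean and variance. The training inputs, targets, and jitter level are made up for illustration.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    # Squared-exponential kernel between 1-D input vectors a and b.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

X = np.array([-2.0, 0.0, 1.0])           # training inputs
y = np.sin(X)                            # noise-free training targets
Xs = np.array([0.0, 3.0])                # test inputs
noise = 1e-6                             # jitter for numerical stability

K = rbf(X, X) + noise * np.eye(len(X))
Ks = rbf(Xs, X)
alpha = np.linalg.solve(K, y)
mean = Ks @ alpha                                      # posterior mean
cov = rbf(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)      # posterior covariance

print(mean)          # interpolates at the data, reverts to the prior far away
print(np.diag(cov))  # near 0 at x = 0, near the prior variance 1 at x = 3
```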

**March 31: Tutorial: Bayesian Optimization**

**April 4: Assignment 3 due, submitted through Markus**

**April 4: Generative Adversarial Networks**

**April 13: Project due, submitted through Markus**