Machine learning is a set of techniques that allow machines to learn from data and experience, rather than requiring humans to specify the desired behavior by hand. Over the past two decades, machine learning techniques have become increasingly central both in AI as an academic field, and in the technology industry. This course provides a broad introduction to some of the most commonly used ML algorithms. It also serves to introduce key algorithmic principles which will serve as a foundation for more advanced courses, such as CSC412/2506 (Probabilistic Learning and Reasoning) and CSC421/2516 (Neural Networks and Deep Learning).
The first half of the course focuses on supervised learning. We begin with nearest neighbours, decision trees, and ensembles. Then we introduce parametric models, including linear regression, logistic and softmax regression, and neural networks. We then move on to unsupervised learning, focusing in particular on probabilistic models, but also covering principal components analysis and K-means. Finally, we cover the basics of reinforcement learning.
See the course information handout.
There are four sections of the course. Since all sections are fully subscribed, please attend the one you are registered for.
Starting Monday, 11/19, all tutorials will be held in the main lecture room for the corresponding section.
Section 1  Section 2  Section 3  Section 4
Instructor:  Amirmassoud Farahmand  Juan Carrasquilla  Roger Grosse  Roger Grosse
Lecture Time:  Monday 11–1  Wednesday 11–1  Thursday 4–6  Friday 11–1
Lecture Room:  BA 1170  BA 1160  BA 1170  SF 1101
Tutorial Time:  Monday 3–4  Wednesday 3–4  Thursday 6–7  Friday 3–4
Tutorial Room:  (assigned by first letter of last name)
Most weekly homeworks will be due on Wednesdays at 11:59pm. Please see the course information handout for detailed policies (marking, lateness, etc.).
Homework 1 (out 9/19, due 9/26)
Materials: [Handout] [clean_real.txt] [clean_fake.txt] [clean_script.py]
TA office hours: Fri 9/21, 4–5pm, in BA3289; Tues 9/25, 6–7pm, in PT290C; Wed 9/26, 2–3pm, in BA2283

Homework 2 (out 9/27, due 10/3)
Materials: [Handout]
TA office hours: Fri 9/28, 4–5pm, in BA3289; Tues 10/2, 6–7pm, in BA3201; Wed 10/3, 2–3pm, in BA2283

Homework 3 (out 10/4, due 10/12)
Materials: [Handout] [Starter Code]
TA office hours: Fri 10/5, 4–5pm, in BA3289; Tues 10/9, 6–7pm, in BA3201; Wed 10/10, 2–3pm, in BA2283; Thurs 10/11, 2–3pm, in BA3201; Fri 10/12, 4–5pm, in BA3289

Homework 4 (out 10/20)
Materials: [Handout]
TA office hours: Wed 10/24, 2–3pm, in BA2283; Thurs 10/25, 2–3pm, in BA3201; Fri 10/26, 4–5pm, in BA3289; Tues 10/30, 6–7pm, in BA3201; Wed 10/31, 2–3pm, in BA2283; Thurs 11/1, 2–3pm, in BA3201; Fri 11/2, 4–5pm, in BA3289

Homework 5 (out 11/1, due 11/14)
Materials: [Handout] [q1.py] [hw5digits.zip] [data.py]
TA office hours: Thurs 11/8, 2–3pm, in BA3201; Fri 11/9, 4–5pm, in BA3289; Tues 11/13, 6–7pm, in BA3201; Wed 11/14, 2–3pm, in BA2283

Homework 6 (out 11/15, due 11/21)
Materials: [Handout] [Code and Data]
TA office hours: Thurs 11/15, 2–3pm, in BA3201; Fri 11/16, 4–5pm, in BA3289; Tues 11/20, 6–7pm, in BA3201; Wed 11/21, 2–3pm, in BA2283

Homework 7 (out 11/22)
Materials: [Handout]
TA office hours: Tues 11/27, 6–7pm, in BA3201; Wed 11/28, 2–3pm, in BA2283; Thurs 11/29, 2–3pm, in BA3201; Fri 11/30, 4–5pm, in BA3289; Tues 12/4, 6–7pm, in BA3201; Wed 12/5, 2–3pm, in BA2283

Homework 8 (out 11/29, not marked)
Materials: [Handout] [Code Solution]
TA office hours: N/A
The course will have a midterm and a final exam. Everybody is required to take both, including graduate students.
The midterm will be held from 6–7pm on Friday, October 19.
The final exam will be held from 7–10pm on Tuesday, December 11.
Here is a tentative schedule, which will likely change as the course goes on. Each "Lecture" corresponds to 50 minutes, so each 2-hour lecture session will cover two of them.
Suggested readings are just that: resources we recommend to help you understand the course material. They are not required; you are only responsible for the material covered in lecture.
ESL = The Elements of Statistical Learning, by Hastie, Tibshirani, and Friedman.
MacKay = Information Theory, Inference, and Learning Algorithms, by David MacKay.
Barber = Bayesian Reasoning and Machine Learning, by David Barber.
Bishop = Pattern Recognition and Machine Learning, by Chris Bishop.
Sutton and Barto = Reinforcement Learning: An Introduction, by Sutton and Barto.
Topic  Dates  Slides  Suggested Readings  
Lecture 1  Introduction  9/6, 9/7, 9/10, 9/12 
[Slides] 
ESL: Chapter 1 
Lecture 2  Nearest Neighbours  9/6, 9/7, 9/10, 9/12 
[Slides] 
ESL: 2.1–2.3 and 2.5
Lecture 3  Decision Trees  9/13, 9/14, 9/17, 9/19 
[Slides] 
ESL: 9.2 
Lecture 4  Ensembles I  9/13, 9/14, 9/17, 9/19 
[Slides] 
ESL: 2.9, 8.7, 15 
Lecture 5  Ensembles II  9/20, 9/21, 9/24, 9/26 
[Slides] 
ESL: 10.1 
Lecture 6  Linear Regression  9/20, 9/21, 9/24, 9/26 
[Slides] 
ESL: 2.3, 3.1–3.2.1
Lecture 7  Linear Classification I  9/27, 9/28, 10/1, 10/3 
[Slides]  csc321 notes 
Lecture 8  Linear Classification II  9/27, 9/28, 10/1, 10/3 
[Slides]  csc321 notes 
Lecture 9  SVMs and Boosting  10/4, 10/5, 
[Slides]  Note: lecture will be videotaped because of Thanksgiving 
Lecture 10  Neural Networks I  10/4, 10/5, 
[Slides]  Notes: part 1, part 2. Note: lecture will be videotaped because of Thanksgiving
Lecture 11  Neural Networks II  10/11, 10/12, 10/15, 10/17 
[Slides]  Notes: part 1, part 2 
Lecture 12  Principal Components Analysis  10/11, 10/12, 10/15, 10/17 
[Slides]  ESL: 14.5.1 
Lecture 13  Probabilistic Models I  10/18, 10/19, 10/22, 10/24 
[Slides] 
notes; MacKay: Chapter 23
Lecture 14  Probabilistic Models II  10/18, 10/19, 10/22, 10/24 
[Slides]  MacKay, Chapters 21, 24 
Lecture 15  K-Means  10/25, 10/26, 10/29, 10/31
[Slides] 
MacKay: Chapter 20; Bishop: 9.1
Lecture 16  Expectation-Maximization I  10/25, 10/26, 10/29, 10/31
[Slides] 
notes; Barber: 20.1–20.3; Bishop: 9.2–9.4
Lecture 17  Expectation-Maximization II  11/1, 11/2, 11/12, 11/14
[see Lecture 16 slides]
Lecture 18  Matrix Factorizations  11/1, 11/2, 11/12, 11/14 
[Slides]  
Lecture 19  Bayesian Linear Regression  11/15, 11/16, 11/19, 11/21 
[Slides]  Bishop: 3.3 
Lecture 20  Gaussian Processes  11/15, 11/16, 11/19, 11/21 
[Slides]  Bishop: 6.1–6.2, 6.4.1–6.4.3
Lecture 21  Reinforcement Learning I  11/22, 11/23, 11/26, 11/28 
[Slides]  Sutton and Barto: 3, 4.1, 4.4, 6.1–6.5
Lecture 22  Reinforcement Learning II  11/22, 11/23, 11/26, 11/28 
see Lecture 21 slides
Lecture 23  Algorithmic Fairness  11/29, 11/30, 12/3, 12/5 
[Slides]  TBA 
Lecture 24  Closing Thoughts  11/29, 11/30, 12/3, 12/5 
[Slides]  TBA 
Dates  Topic  Materials  
Tutorial 1  9/6, 9/7, 9/10, 9/12  probability review  [Slides] 
Tutorial 2  9/13, 9/14, 9/17, 9/19 
linear algebra review: matrix multiplication, linear systems; NumPy basics
[Linear algebra slides] [NumPy presentation] [NumPy exercises] 
Tutorial 3  9/20, 9/21, 9/24, 9/26  gradient descent 
[Slides] [Lecture ipynb] [Worksheet ipynb] 
Tutorial 4  9/27, 9/28, 10/1, 10/3  linear algebra review: projection, eigenvalues, SVD 
[Slides] [NumPy presentation] [NumPy exercises] 
10/4, 10/5, 10/8, 10/10  No tutorial (Thanksgiving)  
Tutorial 5  10/11, 10/12, 10/15, 10/17  midterm review  [Slides] 
10/18, 10/19, 10/22, 10/24  No tutorial  
Tutorial 6  10/25, 10/26, 10/29, 10/31  Markov chain Monte Carlo 
[Slides] [ipynb] 
Tutorial 7  11/1, 11/2, 11/12, 11/14  learning and inference with multivariate Gaussians 
[Slides] 
Tutorial 8  11/15, 11/16, 11/19, 11/21  Bayesian Optimization 
[Slides] [ipynb 1] [ipynb 2] 
Tutorial 9  11/22, 11/23, 11/26, 11/28  reinforcement learning 
[Slides] [ipynb] 
Tutorial 10  11/29, 11/30, 12/3, 12/5  final exam review  [Slides] 
The easiest option is probably to install everything yourself on your own machine.
If you don't already have Python 3, install it.
We recommend some version of Anaconda (Miniconda, a lightweight distribution, is probably your best bet). You can also install Python directly if you know how.
Optionally, create a virtual environment for this class and step into it. If you have a conda distribution, run the following commands:

conda create --name csc411
source activate csc411
Use pip to install the required packages:

pip install scipy numpy autograd matplotlib jupyter scikit-learn
All the required packages are already installed on the Teaching Labs machines.
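As a quick sanity check (this is our suggestion, not part of the official setup instructions), you can run a short Python script that tries to import each required package and reports any that are missing. Note that scikit-learn is imported under the name sklearn:

```python
# Sanity check: confirm the packages required for the course can be imported.
import importlib

# Import names (scikit-learn is imported as "sklearn").
packages = ["scipy", "numpy", "autograd", "matplotlib", "jupyter", "sklearn"]

missing = []
for name in packages:
    try:
        importlib.import_module(name)
    except ImportError:
        missing.append(name)

if missing:
    print("Missing packages:", ", ".join(missing))
else:
    print("All required packages found.")
```

If anything is reported missing, re-run the pip command above inside your csc411 environment.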