CSC 311 Fall 2020: Introduction to Machine Learning

Overview

Machine learning (ML) is a set of techniques that allow computers to learn from data and experience, rather than requiring humans to specify the desired behaviour by hand. ML has become increasingly central both in AI as an academic field, and in industry. This course provides a broad introduction to some of the most commonly used ML algorithms. It also serves to introduce key algorithmic principles which will serve as a foundation for more advanced courses, such as CSC412/2506 (Probabilistic Learning and Reasoning) and CSC413/2516 (Neural Networks and Deep Learning).

We start with nearest neighbours, the canonical nonparametric model. Next we turn to parametric models: linear regression, logistic regression, softmax regression, and neural networks. We then move on to unsupervised learning, focusing in particular on probabilistic models, but also covering principal component analysis and K-means. Finally, we cover the basics of reinforcement learning.
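To give a flavour of the first topic, here is a minimal nearest-neighbours classifier written with scikit-learn, one of the libraries used in the homeworks. It is only an illustrative sketch: the built-in Iris dataset, the train/test split, and the choice of k = 5 are arbitrary and are not part of any course assignment.

    # Illustrative k-nearest-neighbours sketch; not course code.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    # Any small labelled dataset works; Iris is just a convenient built-in.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # k-NN is nonparametric: it memorizes the training set and labels each
    # test point by a majority vote among its k closest training points.
    knn = KNeighborsClassifier(n_neighbors=5)
    knn.fit(X_train, y_train)
    print("test accuracy:", knn.score(X_test, y_test))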

Announcements

Where and When

Each section of this course corresponds to one lecture and one tutorial time. Class will be held synchronously online every week, combining lecture and tutorial exercises. Students are encouraged to attend both the lecture and the tutorial each week. There will be two mandatory tests, held during the scheduled class time.

Sections                     Lecture Time    Tutorial Time
LEC0101, LEC0102, LEC2001    Monday 11-1     Monday 3-4
LEC0201, LEC0202, LEC2002    Thursday 4-6    Thursday 7-8

Online delivery. Lectures will be delivered synchronously via Zoom, and recorded for asynchronous viewing by enrolled students. Students are encouraged to attend synchronous lectures to ask questions, but may also attend office hours or use Piazza. All information about attending virtual lectures, tutorials, and office hours will be sent to enrolled students through Quercus.

Course videos and materials belong to your instructor, the University, and/or other sources, depending on the specific facts of each situation, and are protected by copyright. In this course, you are permitted to download session videos and materials for your own academic use, but you should not copy, share, or use them for any other purpose without the explicit permission of the instructor. For questions about the recording and use of videos in which you appear, please contact your instructor.

Teaching Staff

Instructors

Instructor        Office Hours     Sections
Juhan Bae         Thursday 2-4     LEC0102
Roger Grosse      Monday 1-3       LEC0201
Chris Maddison    Monday 6-8       LEC0101, LEC2001
Silviu Pitis      Friday 10-12     LEC0202

Email (instructors only): csc311-2020-09@cs.toronto.edu

Teaching Assistants

We will use Piazza for the course forum. If your question is about the course material and doesn't give away any hints for the homework, please post to Piazza so that the entire class can benefit from the answer.

Linear Algebra Review office hours:
  Thu 10/8 8-9pm
  Fri 10/9 9-10am
  Tue 10/13 10:30am-12:30pm
  Fri 10/16 1:30-3:30pm
  Fri 10/16 4-5pm

Midterm 1 Prep office hours:
  Mon 10/19 10-11am
  Tue 10/20 8-9pm
  Wed 10/21 2-3pm
  Thu 10/22 9-10am

Midterm 2 Prep office hours:
  Fri 11/27 2-3pm
  Mon 11/30 10-11am
  Tue 12/1 6-8pm
  Wed 12/2 4-5pm

Email (TAs & instructors): csc311-2020-09-tas@cs.toronto.edu

Homeworks

Homeworks will typically be due at 11:59pm on Wednesdays and are submitted through MarkUs. Please see the course information handout for detailed policies (marking, lateness, etc.).

Homework 1 (out 9/17, due 9/30)
  Materials: [Handout], [Data]
  TA office hours: Fri 9/25 5-9pm; Mon 9/28 9-11am; Tue 9/29 10:30am-12:30pm; Wed 9/30 10:30am-12:30pm

Homework 2 (out 10/1, due 10/14)
  Materials: [Handout], [Code & Data], [Code V2 & Data]
  TA office hours: Wed 10/7 4-6pm; Fri 10/9 4-6pm; Mon 10/12 9-11am; Tue 10/13 7-9pm; Wed 10/14 2-4pm

Homework 3 (out 10/15, due 11/4)
  Materials: [Handout], [Code]
  TA office hours: Wed 10/28 4-6pm; Fri 10/30 4-6pm; Mon 11/2 9-11am; Tue 11/3 7-9pm; Wed 11/4 2-4pm

Homework 4 (out 11/5, due 11/25)
  Materials: [Handout], [Handout v2], [Code & Data]
  TA office hours: Wed 11/18 4-6pm; Fri 11/20 4-6pm; Mon 11/23 9-11am; Tue 11/24 7-9pm; Wed 11/25 2-4pm

Tests

The course will have two tests, each one hour long and held during the normal class time. The higher of your two marks will count for 15% of your course grade, and the lower for 10%.

The lecture schedule on both test days will be somewhat unusual; see the details below. The reason is that some students will be attending from other time zones, and we wanted to make sure every time zone has at least one acceptable test time.

You must take the test with your assigned section, unless you have prior permission from the instructor.

Test 1 (covers up through Lecture 6)
  Thursday sections (10/22): Test 7-8pm; Lecture 4-6pm
  Monday sections (10/26): Test 11am-noon; Lecture noon-1pm and 3-4pm

Test 2 (covers up through Lecture 10)
  Thursday sections (12/3): Test 7-8pm; Lecture 4-5pm
  Monday sections (12/7): Test 11am-noon; Lecture noon-1pm

Schedule

This is a tentative schedule, which will likely change as the course goes on.

Suggested readings are optional; they are resources we recommend to help you understand the course material. All of the textbooks listed below are freely available online.

Bishop = Pattern Recognition and Machine Learning, by Chris Bishop.
ESL = The Elements of Statistical Learning, by Hastie, Tibshirani, and Friedman.
MacKay = Information Theory, Inference, and Learning Algorithms, by David MacKay.
Barber = Bayesian Reasoning and Machine Learning, by David Barber.
Sutton and Barto = Reinforcement Learning: An Introduction, by Sutton and Barto.

Week 1 (9/10, 9/14)
  Lecture: Introduction, Nearest Neighbours [Slides]
  Tutorial: Probability [Slides]
  Suggested readings: ESL: 1, 2.1-2.3, 2.5; Domingos, 2012, "A few useful things to know about machine learning"; Breiman, 2001, "Statistical Modeling: The Two Cultures"

Week 2 (9/17, 9/21)
  Lecture: Linear Methods for Regression, Optimization [Slides]
  Tutorial: Linear Algebra Review [Slides]
  Suggested readings: Bishop: 3.1; ESL: 3.1-3.2; Course notes: Linear Regression, Calculus

Week 3 (9/24, 9/28)
  Lecture: Logistic Regression, Multiclass Classification, Optimization [Slides]
  Tutorial: Optimization
  Suggested readings: Bishop: 4.1, 4.3; ESL: 4.1-4.2, 4.4, 11; Course notes: Linear Classifiers, Training a Classifier

Week 4 (10/1, 10/5)
  Lecture: Neural Networks [Slides]
  Tutorial: PyTorch
  Suggested readings: Bishop: 5.1-5.3; Course notes: Multilayer Perceptrons, Backpropagation

Week 5 (10/8, 10/12)
  Lecture: Decision Trees, Bias-Variance Decomposition [Slides]
  Tutorial: TBA
  Note: Lecture will be held as usual on Thanksgiving. It will be recorded as usual, so you are welcome to watch the recording instead.
  Suggested readings: Bishop: 3.2; ESL: 2.9, 9.2; Course notes: Generalization

Week 6 (10/15, 10/19)
  Lecture: Bagging, Boosting [Slides]
  Tutorial: Midterm Review [Slides]
  Suggested readings: ESL: 8.7, 10.1-10.5

Week 7 (10/22, 10/26)
  Lecture: Probabilistic Models [Slides]
  Tutorial: None (in-class test)
  Suggested readings: ESL: 2.6.3, 6.6.3, 4.3.0; MacKay: 21, 23, 24; Course notes: Probabilistic Models

Week 8 (10/29, 11/2)
  Lecture: Probabilistic Models cont'd; Principal Component Analysis [Slides]
  Tutorial: Eigenvectors, PCA [Notes]
  Suggested readings: Bishop: 12.1

Week 9 (11/5, 11/16)
  Lecture: PCA cont'd; Matrix Completion; Autoencoders [Slides]
  Tutorial: Final Project [Slides] [Colab]
  Suggested readings: ESL: 14.5.1

Week 10 (11/19, 11/23)
  Lecture: k-Means, EM Algorithm [Slides]
  Tutorial: TBA [Slides]
  Suggested readings: MacKay: 20; Bishop: 9; Barber: 20.1-20.3; Course notes: Mixture Modeling

Week 11 (11/27, 11/30)
  Lecture: Reinforcement Learning [Slides]
  Tutorial: Test 2 Review [Slides]
  Suggested readings: Sutton and Barto: 3, 4.1, 4.4, 6.1-6.5

Week 12 (12/3, 12/7)
  Lecture: AlphaGo and Game Playing [Slides]
  Tutorial: None (in-class test)

Final Project

25% of your total mark is allocated to a final project, which will require you to apply several algorithms to a challenge problem and to write a short report analyzing the results. The final project is due December 15, during the final evaluation period. You can find the full project requirements here and the starter code here.

Paper Readings

5% of your total mark is allocated to reading a set of classic machine learning papers. We hope these papers are both interesting and understandable given what you learn in this course. The 5 points are allocated on the honor system: at the end of the term, you'll check a box to indicate that you've done the readings. You don't need to hand anything in, and the readings will not be covered on the tests.
  1. P. Viola and M. Jones. Rapid object detection using a boosted cascade of simple features. CVPR 2001. [pdf]
  2. A. Krizhevsky, I. Sutskever, and G. E. Hinton. ImageNet classification with deep convolutional neural networks. NIPS 2012. [pdf]
  3. R. Salakhutdinov and A. Mnih. Probabilistic matrix factorization. NIPS 2007. [pdf]
  4. B. A. Olshausen and D. J. Field. Sparse coding with an overcomplete basis set: a strategy employed by V1? Vision Research, 1997. [pdf]
  5. V. Mnih et al. Human-level control through deep reinforcement learning. Nature, 2015. [article]

Computing Resources

For the homework assignments, we will use Python 3 and libraries such as NumPy, SciPy, and scikit-learn. You have two options:
  1. The easiest option is probably to install everything yourself on your own machine.

    1. If you don't already have Python 3, install it.

      We recommend some version of Anaconda (Miniconda, a lightweight conda distribution, is probably your best bet). You can also install Python directly if you know how.

    2. Optionally, create a virtual environment for this class and step into it. If you have a conda distribution, run the following commands (specifying python=3 gives the environment its own Python and pip):

          conda create --name csc311 python=3
          conda activate csc311
    3. Use pip to install the required packages:

          pip install scipy numpy autograd matplotlib jupyter scikit-learn

      Note that the package is named scikit-learn on PyPI, even though you import it as sklearn.
  2. All the required packages are already installed on the Teaching Labs machines.
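Whichever option you choose, you can sanity-check your setup by confirming that the packages import cleanly. This is just a suggested check, not a required step:

      python -c "import numpy, scipy, sklearn, matplotlib, autograd; print('imports OK')"
      jupyter --version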