General Info
The lectures allow us to explain new material, show how it relates to the rest of the course (and to what you've learned in other courses), and give examples of applying the material. Lecture notes that go into more detail will be made available on this page.
Students often learn a lot from working with one another. You are encouraged to meet with other students from class for this purpose.
Lectures roughly fall into four main topics: randomized algorithms; linear programming and primal-dual algorithms; approximation algorithms; streaming algorithms.
Tentative Schedule of Lectures
| Week | Topic | Readings |
|---|---|---|
| Week 1: Jan 6–12 | Monte Carlo Algorithms: Global Min-Cut | From "Algorithm Design" by Kleinberg and Tardos: Min Cut |
| Week 2: Jan 13–19 | Finish Min Cut; Las Vegas Algorithms: Closest Pair of Points | Karger-Stein Algorithm; from "Algorithm Design" by Kleinberg and Tardos: Closest Pair of Points; Finding triangles |
| Week 3: Jan 20–26 | Approximate Near Neighbours; Fingerprinting for String Matching | Approximate Near Neighbour Search; CLRS Section 32.2; Lecture Notes on Karp-Rabin |
| Week 4: Jan 27–Feb 2 | Finish Approximate Near Neighbours | Approximate Near Neighbour Search; Streaming Algorithms |
| Week 5: Feb 3–9 | Streaming Algorithms | Variance and Chebyshev Worksheet; Streaming Algorithms |
| Week 6: Feb 10–16 | Finish Streaming Algorithms; Random Walks and Markov Chains | Streaming Algorithms; Section 3.3 of this book chapter covers Randomized Selection; Markov Chains; you can also read the intro section of Chapter 4 of this book; Section 4.8 of the book covers PageRank; Linear Algebra Review Sheet |
| Feb 17–23 | NO CLASS: Reading Week | |
| Week 7: Feb 24–March 1 | Random Walks and Markov Chains | Markov Chains; Couplings Worksheet |
| Week 8: March 2–8 | Linear Programming | Linear Programming |
| Week 9: March 9–15 | LP Duality and Complementary Slackness | Linear Programming; Examples of LP formulations |
| Week 10: March 16–22 | Matchings and the Hungarian Algorithm | Goemans's lecture notes; Slides |
| Week 11: March 23–29 | Finish Matchings; Deterministic and Randomized Rounding Algorithms | Goemans's matchings lecture notes; Matchings slides; Sections 1.7 and 5.1 of the Williamson and Shmoys book (all of Chapter 1 recommended); Rounding slides |
| Week 12: March 30–April 3 | Derandomization and Chernoff bounds | Sections 5.2, 5.10–5.12 of the Williamson and Shmoys book; Derandomization slides; Chernoff bounds slides; Chapter 4 of the Motwani-Raghavan book has more information on tail inequalities |
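Several of the topics above can be made concrete with short sketches. For instance, the Karp-Rabin fingerprinting from Week 3 can be sketched roughly as below. This is a simplified illustration, not the course's reference implementation: it uses a fixed Mersenne modulus for brevity, whereas the analysis in lecture relies on choosing a random prime modulus; it also verifies each fingerprint match directly, so collisions cannot cause false positives.

```python
def rabin_karp(text, pattern, base=256, mod=(1 << 61) - 1):
    """Find all occurrences of pattern in text via rolling fingerprints."""
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return []
    # Fingerprint of a string s of length m: sum of s[i] * base^(m-1-i), mod `mod`.
    h_pat = h_win = 0
    for i in range(m):
        h_pat = (h_pat * base + ord(pattern[i])) % mod
        h_win = (h_win * base + ord(text[i])) % mod
    top = pow(base, m - 1, mod)  # weight of the window's leading character
    hits = []
    for i in range(n - m + 1):
        # Verify on fingerprint match to rule out (rare) collisions.
        if h_win == h_pat and text[i:i + m] == pattern:
            hits.append(i)
        if i + m < n:
            # Slide the window: drop text[i], append text[i + m].
            h_win = ((h_win - ord(text[i]) * top) * base + ord(text[i + m])) % mod
    return hits
```

Each window update is O(1), so matching takes O(n + m) time plus verification.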
Suggested Exercises
In addition to the exercises below, always attempt the exercises in the posted readings and lecture notes above.
- Suggested exercises from Williamson and Shmoys: 1.4, 1.5, 5.1, 5.2, 5.3, 5.6, 5.7, 5.14.
- Do the exercises from Goemans's matchings lecture notes.
- For Markov Chains, try exercises 4.1, 4.3, 4.4, 4.7 after Chapter 4 of this book.
- For Karp-Rabin fingerprints, try the exercise after Lecture 7 of Jeff Erickson's notes, and the relevant problems in CLRS Section 32.2.
- Suggested exercises for Min Cut:
- From Motwani and Raghavan's Randomized Algorithms (available through the U of T library at http://go.utlib.ca/cat/8230181), try exercises 10.13–10.15 after Chapter 10.
- From Jeff Erickson's lecture notes, try the exercises after Lecture 13.
- After doing exercise 1 from Erickson's notes, try to give an O(m) time algorithm to execute the contraction algorithm, without using the Klein-Karger-Tarjan MST algorithm.
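For context on that last exercise, here is a minimal sketch of the contraction algorithm itself, as one might implement it before worrying about the O(m) refinement. This is an illustrative version (names, the union-find representation, and the repetition count are my own choices, not from the course): it samples a uniformly random edge of the original edge list and skips self-loops, which is equivalent to sampling uniformly among the contracted multigraph's edges.

```python
import random

def contract_once(n, edges, rng):
    """One run of Karger's contraction algorithm on a multigraph.

    n: number of vertices, labelled 0..n-1; edges: list of (u, v) pairs.
    Returns the size of the cut this run produces (always >= the min cut).
    """
    parent = list(range(n))  # union-find over super-vertices

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    remaining = n
    while remaining > 2:
        # Rejection sampling: a uniform edge of the original list that is
        # not a self-loop is a uniform edge of the contracted multigraph.
        u, v = edges[rng.randrange(len(edges))]
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv  # contract the edge: merge its endpoints
            remaining -= 1
    # The cut consists of edges whose endpoints lie in different sides.
    return sum(1 for u, v in edges if find(u) != find(v))

def karger_min_cut(n, edges, trials=200, seed=0):
    """Repeat the contraction algorithm and keep the smallest cut found."""
    rng = random.Random(seed)
    return min(contract_once(n, edges, rng) for _ in range(trials))
```

Since a single run finds a fixed minimum cut with probability at least 2/(n(n-1)), repeating Θ(n² log n) times makes the failure probability polynomially small; the 200 trials above are just a demonstration value.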
Learning Objectives
By the end of this course, you should be able to:
- Distinguish between Monte Carlo and Las Vegas algorithms.
- Design and analyze randomized algorithms for basic problems in graph theory (e.g. minimum cut). In particular, you should have a deep understanding of Karger's Contraction algorithm, including its analysis and how to implement it efficiently.
- Design and analyze locality sensitive hash functions for different distance metrics, and use them for approximate near neighbour search and related problems.
- Use sampling to estimate sizes of sets, and apply sampling in streaming and other algorithms.
- Define basic concepts for Markov chains: stationary distributions, irreducible and aperiodic Markov chains, and time-reversible Markov chains. You should be able to determine whether a given distribution or Markov chain satisfies these definitions.
- State the Fundamental Theorem of Markov Chains, and use coupling arguments to bound the mixing time of Markov chains.
- Use the Metropolis-Hastings algorithm to design Markov chains with a given stationary distribution.
- Define basic terms in polyhedral geometry, like vertex, face, facet, polytope.
- Model optimization problems as linear programs, and derive the dual of a linear program.
- State the complementary slackness theorem, and understand the analysis of primal-dual algorithms, like the Hungarian algorithm.
- Design and analyze approximation algorithms via deterministic and randomized rounding of linear programming relaxations.
- Derandomize randomized approximation algorithms via the method of conditional expectations.
- State the Chernoff bound and Hoeffding's Inequality, and use them to analyze rounding and sampling algorithms.
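As a small illustration of the Metropolis-Hastings objective above, here is a sketch of a Metropolis chain on states 0..n-1 arranged in a cycle, targeting a stationary distribution proportional to given weights. This is my own toy example, not course code: the proposal (a uniformly random neighbour on the cycle) is symmetric, so the acceptance probability reduces to min(1, π(y)/π(x)).

```python
import random

def metropolis_hastings(weights, steps, seed=0):
    """Run a Metropolis chain targeting pi(i) proportional to weights[i].

    States 0..n-1 sit on a cycle; each step proposes a uniform neighbour
    and accepts with probability min(1, weights[y] / weights[x]).
    Returns the empirical visit frequencies over `steps` steps.
    """
    rng = random.Random(seed)
    n = len(weights)
    counts = [0] * n
    x = 0
    for _ in range(steps):
        y = (x + rng.choice((-1, 1))) % n  # symmetric proposal on the cycle
        if rng.random() < min(1.0, weights[y] / weights[x]):
            x = y  # accept; otherwise stay at x (a self-loop)
        counts[x] += 1
    return [c / steps for c in counts]
```

Because the proposal is symmetric, detailed balance holds for π(i) ∝ weights[i], and the rejections' self-loops make the chain aperiodic, so the empirical frequencies converge to the target distribution.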
Further Reading
Below you can find some surveys and research articles related to the topics in this course. The research articles may be challenging for you, and this is normal. As a start, you can just read the introduction and try to understand the statements of results.
- Article on smoothed analysis, which gives one explanation why the simplex algorithm is very efficient in practice, despite its exponential worst-case running time.
- Matchings: Paths, Trees and Flowers, Edmonds's paper which gave the first efficient algorithm for maximum cardinality matching in general graphs. Check the section which argues why "efficient" can be abstracted as "polynomial time". If you want to learn the algorithm, a better place to start may be these lecture notes.
- Markov Chains: survey by Diaconis with lots of pointers to papers and books.
- Streaming Algorithms: old survey; survey on graph streaming algorithms.
- Near Neighbor Search: old survey; new survey; lecture on some of my work.
- Minimum Cut: practical algorithms; deterministic algorithm.