Contact information

Instructors: Serena Jeblee, Sean Robertson, and Frank Rudzicz
Office: Quercus
Office hours: M 10h-11h
Email: csc401-2021-01@cs (add the toronto.edu suffix)
Forum: Piazza
Email policy: For non-confidential inquiries, consult the Piazza forum first. For confidential assignment-related inquiries, contact the TA associated with the particular assignment. Emails sent from University of Toronto addresses with appropriate subject headings are least likely to be redirected to junk-mail folders.

Meeting times

Lectures: MF 9h-10h on BB Collaborate
Tutorials: W 9h-10h on BB Collaborate

Course outline

This course presents an introduction to natural language computing in applications such as information retrieval and extraction, intelligent web searching, speech recognition, and machine translation. These applications will involve various statistical and machine learning techniques. Assignments will be completed in Python. All code must run on the 'teaching servers'.

The theme for this year is speech and text analysis in a post-truth society.

Prerequisites: CSC207H1 / CSC209H1 ; STA247H1 / STA255H1 / STA257H1
Recommended Preparation: MAT221H1 / MAT223H1 / MAT240H1 is strongly recommended. For advice, contact the Undergraduate Office.

The course information sheet is available here.


Readings for this course

Optional: Foundations of Statistical Natural Language Processing, C. Manning and H. Schütze
Optional: Speech and Language Processing, D. Jurafsky and J.H. Martin

Readings on the theme of the course


Supplementary reading

Topic: Title (Author(s))

Smoothing: An Empirical Study of Smoothing Techniques for Language Modeling (Stanley F. Chen and Joshua Goodman)
Hidden Markov models: A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition (Lawrence R. Rabiner)
Sentence alignment: A Program for Aligning Sentences in Bilingual Corpora (William A. Gale and Kenneth W. Church)
Seq2Seq: Sequence to Sequence Learning with Neural Networks (Ilya Sutskever, Oriol Vinyals, and Quoc V. Le)
Attention: Attention Is All You Need (Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin)
Attention-based NMT: Effective Approaches to Attention-based Neural Machine Translation (Minh-Thang Luong, Hieu Pham, and Christopher D. Manning)
Phonetic alphabets: ASCII Phonetic Symbols for the World's Languages: Worldbet (James L. Hieronymus)
Gaussian mixture models: Robust Text-Independent Speaker Identification Using Gaussian Mixture Speaker Models (Douglas A. Reynolds and Richard C. Rose)
Transformation-based learning: Transformation-Based Error-Driven Learning and Natural Language Processing: A Case Study in Part-of-Speech Tagging (Eric Brill)
Feature selection: Correlation-based Feature Selection for Machine Learning (Mark A. Hall)
Feature selection: Minimum Redundancy, Maximum Relevance (Hanchuan Peng)
Sentence boundaries: Sentence Boundaries (J. Read, R. Dridan, S. Oepen, and L. J. Solberg)

Evaluation policies

General
You will be graded on three homework assignments and a final assessment. The relative proportions of these grades are as follows:
  Assignment with lowest mark:  15%
  Assignment with median mark:  20%
  Assignment with highest mark: 25%
  Final assessment:             40%
Graduate students enrolled in CSC2511 will have the option of undertaking a course project (instead of the assignments), in teams of at most two students, for 60% of the course grade (the final exam, worth 40%, is still required). Information on the course project can be found here.
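As a quick sketch of how the assignment and final weights above combine, here is a hypothetical calculation (the function name and example marks are illustrative only, not part of the course materials):

```python
def course_grade(assignment_marks, final_mark):
    """Combine three assignment marks (weighted 15/20/25% from lowest
    to highest) with the final assessment (40%), per the table above."""
    lowest, median, highest = sorted(assignment_marks)
    return 0.15 * lowest + 0.20 * median + 0.25 * highest + 0.40 * final_mark

# E.g., assignment marks of 70, 85, and 90 with a final mark of 80:
print(course_grade([70, 85, 90], 80))  # roughly 82.0 (floating point)
```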
Lateness
A 10% deduction is applied to homework submitted even one minute after the due time. An additional 10% deduction is applied for every subsequent 24 hours, up to 72 hours late, at which point the homework receives a mark of zero. No exceptions will be made except in emergencies (including medical emergencies), at the instructor's discretion.
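The deduction schedule can be sketched as follows (the function, and the assumption that deductions accrue per completed 24-hour period, are mine, not official policy):

```python
def late_penalty(hours_late):
    """Percent deducted from a late homework: 10% as soon as the due
    time passes, another 10% per completed 24-hour period thereafter,
    and a mark of zero (100% deduction) at 72 hours or more."""
    if hours_late <= 0:
        return 0      # on time: no deduction
    if hours_late >= 72:
        return 100    # three days late or more: mark of zero
    return 10 + 10 * int(hours_late // 24)

# E.g., 1 hour late -> 10, 30 hours late -> 20, 80 hours late -> 100
```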
Final
A mark of at least D- on the final exam is required to pass the course. In other words, if you receive an F on the final exam then you automatically fail the course, regardless of your performance in the rest of the course.
Collaboration and plagiarism
No collaboration on the homeworks is permitted. The work you submit must be your own. "Collaboration" in this context includes but is not limited to sharing of source code, correction of another's source code, copying of written answers, and sharing of answers prior to submission of the work (including the final exam). Failure to observe this policy is an academic offense, carrying a penalty ranging from a zero on the homework to suspension from the university. See Academic integrity at the University of Toronto.

Syllabus

The following is an estimate of the topics to be covered in the course and is subject to change.

  • Introduction to corpus-based linguistics
  • N-gram models
  • Entropy and information theory
  • Hidden Markov models
  • Machine translation (statistical and neural)
  • Neural language models and word embedding
  • Articulatory and acoustic phonetics
  • Automatic speech recognition
  • Speech synthesis
  • Information retrieval
  • Dialogue and chatbots

Calendar

11 January: First lecture
18 January: Last day to add CSC 2511
24 January: Last day to add CSC 401
10 February: Assignment 1 due
15-19 February: Reading week (no lectures or tutorials)
1 March: Last day to drop CSC 2511
10 March: Assignment 2 due
15 March: Last day to drop CSC 401
9 April: Last lecture
9 April: Assignment 3 due (extended from 3 April)
9 April: Project report due
23 April: Final assessment

See Dates for undergraduate students.

See Dates for graduate students.


News and announcements

  • FIRST LECTURE: 11 January at 10h on BB Collaborate.
  • FIRST TUTORIAL: 13 January at 10h on BB Collaborate.


Lecture materials

Assigned readings give you more in-depth information on ideas covered in lectures. You will not be asked questions relating to readings for the assignments, but they will be useful in studying for the final exam.

Provided PDFs are compressed to roughly 10% of their original size for portability, at the expense of fidelity.

  1. Introduction.
    • Date: 11 Jan.
    • Reading: Manning & Schütze: Sections 1.3-1.4.2 and 6.0-6.2.1
  2. Corpora, language models, Zipf, and smoothing.
    • Date: 15 Jan.
    • Reading: Manning & Schütze: Sections 1.4.3, 6.1-6.2.2, 6.2.5, and 6.3
    • Reading: Jurafsky & Martin: 3.4-3.5
  3. Features and classification.
    • Date: 22 Jan
    • Reading: Manning & Schütze: Sections 16.1 and 16.4
    • Reading: Jurafsky & Martin (2nd ed): Sections 5.1-5.5
  4. Entropy and decisions.
    • Date: 25, 29 Jan
    • Reading: Manning & Schütze: Sections 2.2, 5.3-5.5
  5. Neural language models.
    • Date: 1, 5 Feb
  6. Machine translation.
    • Date: 8, 12 Feb
    • Reading: Manning & Schütze: Sections 13.0, 13.1.2, 13.1.3, 13.2, 13.3, and 14.2.2
  7. Hidden Markov Models.
    • Date: 26 Feb; 1, 5 Mar
    • Reading: Manning & Schütze: Section 9.2-9.4.1 (an alternative formulation)
  8. Speech.
    • Date: 8 Mar
  9. ASR.
    • Date: 19 Mar
  10. Speech synthesis.
    • Date: 26 Mar
  11. Intelligent dialogue agents.
    • Date: 5 Apr
  12. Review.
    • Date: 9 Apr

Tutorial materials


Assignments

Here is the ID template that you must submit with your assignments. Here is the MarkUs link you use to submit them.

Project

The course project is an optional replacement for the assignments available to graduate students in CSC2511.


Midterm

Here is the not-for-marks midterm: Midterm 2020.

2020 lectures

Here are the lectures from last year's iteration of this course.
