Zining Zhu (朱子宁)

PhD Student, University of Toronto


My long-term goal is to build reliable and trustworthy NLP systems that can be deployed stably across domains. I am interested in understanding the mechanisms by which neural NLP systems encode and use knowledge, with language as a medium.
The following are some ongoing projects:
Probing: Methods for diagnostic analysis on deep NLP systems.
Reasoning: Studying the abilities of neural systems to use knowledge in human-like ways.
Discourse & pragmatics: Analyzing discourse from interpretable and scalable dimensions.
Society: Exploring the sentiment, emotion, morality, and other aspects of social media texts.
[Interested in these projects?] [Or an Interpretable NLP seminar?]





Education

University of Toronto 2019 - 2024 (Expected)
Ph.D. student in Computer Science
Supervisor: Frank Rudzicz
University of Toronto 2014 - 2019
Bachelor's degree in Engineering Science, Robotics option

Work / Research

Tencent Jarvis Lab, Machine Learning Engineering Intern, May 2019 - Aug 2019
Neural language models and pre-training techniques.
Winterlight Labs, Research Software Engineer, Sep 2017 - Sep 2018
Automatic detection of dementia from narrative speech.
TripAdvisor, Software Engineering Intern, June - Aug, 2017
Android applications and Java API.
Dynamic Systems Lab at UTIAS, Research Assistant, May - Sep, 2016
Enhancing drone controllers using deep neural networks.


Teaching

  • CSC401/2511 (Toronto) Natural Language Computing - Co-instructor - Winter 2022
  • CSC2515 (Toronto) Introduction to Machine Learning - TA - Fall 2021
  • CSCC24 (Toronto) Principles of Programming Languages - TA - Summer 2021
  • CSC148 (Toronto) Introduction to Computer Science - TA - Summer 2021
  • CSC401/2511 (Toronto) Natural Language Computing - TA - Winter 2021
  • CSC309 (Toronto) Web Programming - TA - Fall 2020
  • CSC401/2511 (Toronto) Natural Language Computing - TA - Winter 2020
  • ECE324 (Toronto) Introduction to Machine Intelligence - TA - Fall 2019
  • CSC180 (Toronto) Introduction to Computer Science - TA - Fall 2016


Service

  • Reviewing for conferences: ACL (2020-2021), EMNLP (2020-2021), NAACL (2021), AAAI (2021), ICLR (2022)
  • Reviewing for journals:
    • IEEE Journal of Biomedical and Health Informatics
    • Computer Methods & Programs in Biomedicine


Awards

  • Vector Institute Research Grant, Institutional, 2020, 2021
  • Dean's List, Institutional, undergraduate years 2014 - 2019
  • Engineering Science Research Opportunity Program (ESROP) fellowship, Institutional, Summer 2016
  • Chinese Physics Olympiad (CPhO) 1st Prize, Provincial, 2013

Selected Talks

  • Invited talk. Probing neural language models, AISC Recent Advances in NLP talk, Aug 15, 2021
  • Invited talk. Improving the neural NLP model performances with linguistic probes, Zhi-Yi Technology Advances in NLP talk, Nov 20, 2020
  • Invited talk. An information theoretic view on selecting linguistic probes, Tsinghua University AI TIME talk, Oct 30, 2020
  • Spotlight talk. Examining the rhetorical capacities of neural language models, Vector Institute NLP Symposium, Sep 16, 2020
  • Invited talk. RecitalBoard: Efficient pre-training methods for language modeling, Tencent Jarvis Lab, Shenzhen, China, Aug 5, 2019
  • Invited talk. Detecting cognitive impairments with machine learning, UTMIST, Toronto, Canada, Nov 20, 2018
  • Invited talk. Probabilistic Graphical Models, UTADA, Toronto, Canada, Oct 21, 2017

Media Coverage