I am a Ph.D. student in Computer Science at the University of Toronto, supervised by Chris Maddison. Previously, I obtained my Bachelor's degree in Information Engineering from Zhejiang University in July 2020. In summer 2019, I was a visiting student at UCLA, where I worked with Cho-Jui Hsieh.
I’m currently working at the intersection of machine learning and information theory, with a focus on representation learning, domain generalization, and neural compression. In particular, I want to understand how to learn useful and trustworthy representations from data, and to develop more efficient and principled representation learning methods. I’m also interested in implicit deep learning and generative modeling.
Selected Publications [Full List]
* below denotes equal contribution
- Optimal Representations for Covariate Shift. In International Conference on Learning Representations (ICLR), 2022
- Improving Lossless Compression Rates via Monte Carlo Bits-Back Coding. In International Conference on Machine Learning (ICML), 2021 [Long talk]
- Learning to Learn by Zeroth-Order Oracle. In International Conference on Learning Representations (ICLR), 2020
Services
- Conference reviewer: NeurIPS (2020, 2021), ICLR (2021, 2022), ICML (2021)
- Workshop reviewer: NeurIPS DGMs Applications Workshop (2021)
Selected Awards & Honors
- DiDi Graduate Student Award, 2021
- Computer Science 50th Anniversary Graduate Scholarship, 2020
- Chu Kochen Scholarship (highest honor at Zhejiang University), 2019
- Top 10 Students at Zhejiang University, 2019
- Cross-disciplinary Scholars in Science and Technology (CSST), UCLA, 2019
- National Scholarship (top 1.5%), 2017, 2018, 2019
- Meritorious Winner, Interdisciplinary Contest in Modeling (ICM), 2018