Will Grathwohl

Email: wgrathwohl[ at ]cs[ dot ]toronto[ dot ]edu
Github: wgrathwohl
Linkedin: will-grathwohl
CV

Drawing inspiration from many of my great math professors from undergrad, this website will be poorly made and have lots of improperly formatted HTML.

I completed my undergraduate degree in Mathematics at MIT in 2014. I am now a graduate student in the Machine Learning Group here at the University of Toronto.

I am co-supervised by Richard Zemel and David Duvenaud.

The main goal of my research is to get machine learning methods to work well with considerably less labeled data.
To accomplish this goal, I am interested in semi-supervised learning methods, as well as in incorporating (mainly discrete) structure into inference models.

I spent this past summer interning at OpenAI, where I worked with Durk Kingma and Ilya Sutskever on generative models.
I have been using new techniques for incorporating ODEs into deep learning models to build more expressive and parameter-efficient normalizing flows.
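As a rough sketch of the underlying idea (notation is my own, not taken from this page): in a continuous normalizing flow, a neural network defines the dynamics of an ODE that transports samples, and the log-density changes by an integrated trace of the Jacobian rather than a log-determinant.

% Sketch of a continuous normalizing flow; f_\theta is an assumed neural network defining the dynamics.
\begin{align}
  \frac{d z(t)}{d t} &= f_\theta\big(z(t), t\big), \\
  \log p\big(z(t_1)\big) &= \log p\big(z(t_0)\big)
    - \int_{t_0}^{t_1} \operatorname{Tr}\!\left(\frac{\partial f_\theta}{\partial z(t)}\right) dt.
\end{align}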

Papers

Conference

Backpropagation through the Void: Optimizing control variates for black-box gradient estimation: Will Grathwohl, Dami Choi, Yuhuai Wu, Geoff Roeder, David Duvenaud.
ICLR 2018.
We present a general method for estimating the gradients of expectations of functions of random variables.
Our method can be applied to distributions over discrete random variables, and even when the function being optimized is not known!
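As a rough sketch of the quantity involved (notation my own, not from the paper): the goal is the gradient of an expectation with respect to the parameters of the sampling distribution; the score-function identity turns it into an expectation that can be estimated by sampling, and a control variate c (our paper uses a learned surrogate, but here c is just a generic control variate) is subtracted to reduce variance.

% Score-function gradient with a generic control variate c; c and this notation are assumptions for illustration.
\begin{align}
  \nabla_\theta \, \mathbb{E}_{p(b \mid \theta)}\big[f(b)\big]
    &= \mathbb{E}_{p(b \mid \theta)}\big[f(b)\, \nabla_\theta \log p(b \mid \theta)\big] \\
    &= \mathbb{E}_{p(b \mid \theta)}\big[\big(f(b) - c(b)\big)\, \nabla_\theta \log p(b \mid \theta)\big]
       + \nabla_\theta \, \mathbb{E}_{p(b \mid \theta)}\big[c(b)\big].
\end{align}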

Workshops

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models: Will Grathwohl*, Ricky T. Q. Chen*, Jesse Bettencourt, Ilya Sutskever, David Duvenaud. (*equal contribution)
Symposium on Advances in Approximate Bayesian Inference 2018. Oral Presentation.

Modeling Global Class Structure Leads to Rapid Integration of New Classes: Will Grathwohl, Eleni Triantafillou, Xuechen Li, David Duvenaud and Richard Zemel.
NIPS 2018 Workshop on Meta-Learning
NIPS 2018 Workshop on Continual Learning

Training Glow with Constant Memory Cost: Xuechen Li, Will Grathwohl.
NIPS 2018 Workshop on Bayesian Deep Learning

Gradient-Based Optimization of Neural Network Architecture: Will Grathwohl, Elliot Creager, Kamyar Ghasemipour, Richard Zemel.
ICLR 2018 Workshop.

Backpropagation through the Void: Optimizing control variates for black-box gradient estimation: Will Grathwohl, Dami Choi, Yuhuai Wu, Geoff Roeder, David Duvenaud.
NIPS 2017 Deep Reinforcement Learning Symposium. Oral Presentation. A video of my talk can be found here.

Preprints

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models: Will Grathwohl*, Ricky T. Q. Chen*, Jesse Bettencourt, Ilya Sutskever, David Duvenaud. (*equal contribution)

Awards and Fellowships

Borealis AI Graduate Fellowship: A two-year, $50,000 fellowship supporting research in AI, funded by the Royal Bank of Canada.
Huawei Prize: A financial award based on academic and research performance.
ICLR 2018 Travel Award