Will Grathwohl

Email: wgrathwohl[ at ]cs[ dot ]toronto[ dot ]edu
GitHub: wgrathwohl
LinkedIn: will-grathwohl

Drawing inspiration from many of my great math professors from undergrad, this website will be poorly made and have lots of improperly formatted HTML.

I completed my undergraduate degree in Mathematics at MIT in 2014. I am now a graduate student in the Machine Learning Group here at the University of Toronto.

I am co-supervised by Richard Zemel and David Duvenaud.

The main goal of my research is to get machine learning methods to work well with considerably less labeled data.
To accomplish this, I am interested in semi-supervised learning methods, as well as in incorporating (mainly discrete) structure into inference models.

Papers

I just started, so there is more to come...

Preprints

Backpropagation through the Void: Optimizing control variates for black-box gradient estimation. Will Grathwohl, Dami Choi, Yuhuai Wu, Geoff Roeder, David Duvenaud.
We present a general method for estimating gradients of expectations of functions of random variables.
Our method can be applied to distributions over discrete random variables, and even to settings where the function being optimized is unknown!
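To make the problem concrete, here is a minimal sketch of the classic score-function (REINFORCE) estimator for this kind of gradient. This is not the paper's method, just the standard high-variance baseline that control-variate approaches like ours aim to improve; the Bernoulli example and the function f used below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Score-function (REINFORCE) estimate of
#   d/d(theta) E_{b ~ Bernoulli(sigmoid(theta))}[f(b)].
# It only needs samples of f(b), so f can be a black box and b discrete.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def reinforce_grad(f, theta, n_samples=10_000, seed=0):
    rng = np.random.default_rng(seed)
    p = sigmoid(theta)                    # Bernoulli success probability
    b = (rng.random(n_samples) < p)       # discrete samples b in {0, 1}
    score = b.astype(float) - p           # d/d(theta) log p(b | theta) = b - p
    return np.mean(f(b.astype(float)) * score)

# Toy example: f(b) = (b - 0.45)^2.  The exact gradient is
#   p * (1 - p) * (f(1) - f(0)) = p * (1 - p) * 0.1.
theta = 0.3
f = lambda b: (b - 0.45) ** 2
print(reinforce_grad(f, theta))
```

The estimator is unbiased but often very noisy; reducing that variance (for example with learned control variates) is exactly the kind of problem this line of work addresses.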