The Imagen text-to-image generative model won Outstanding Paper Award at NeurIPS 2022!
Our offline RL work "Why so pessimistic?" will be presented at NeurIPS 2022!
The Imagen text-to-image generative model will be presented at NeurIPS 2022!
Presenting our line of work on robotic magnetic assembly, as well as the Imagen text-to-image generative model.
Our work "Blocks Assemble!" will be presented at ICML 2022!
Our effort on Sim2Real transfer of bimanual magnetic assembly policies is available on arXiv!
We are very excited about our work "Why so pessimistic? Estimating uncertainties for offline RL through ensembles, and why their independence matters." (with Shane Gu and Ofir Nachum) which was accepted at the Offline RL Workshop at NeurIPS 2021.
Our work "EMaQ: Expected-Max Q-Learning Operator for Simple Yet Effective Offline and Online RL" (with Dale Schuurmans and Shane Gu) was accepted at ICML 2021.
Our paper "A Divergence Minimization Perspective on Imitation Learning Methods" (with Richard Zemel and Shane Gu) received the Best Paper Award at the Conference on Robot Learning (CoRL) 2019!
This semester I am interning with Corey Lynch and Pierre Sermanet at Google Brain Robotics in Mountain View.
Our paper "A Divergence Minimization Perspective on Imitation Learning Methods" (with Richard Zemel and Shane Gu) was accepted as an oral at CoRL 2019!
Our paper "SMILe: Scalable Meta Inverse Reinforcement Learning through Context-Conditional Policies" (with Shane Gu and Richard Zemel) was accepted as a poster at NeurIPS 2019!
Our paper "SMILe: Scalable Meta Inverse Reinforcement Learning through Context-Conditional Policies" (with Shane Gu and Richard Zemel) was accepted as an oral presentation to the Imitation, Intent, and Interaction (I3) Workshop at ICML 2019!
Our paper "Interpreting Imitation Learning Methods Under a Divergence Minimization Perspective" (with Shane Gu and Richard Zemel) was accepted to the Imitation, Intent, and Interaction (I3) Workshop at ICML 2019!
Our paper "Interpreting Imitation Learning Methods Under a Divergence Minimization Perspective" (with Shane Gu and Richard Zemel) was accepted to the Deep Generative Models for Highly Structured Data Workshop at ICLR 2019!