Colloquium Details
**Special Time: 11:00am** Composing differentiable procedures for modeling, inference, and optimization
Speaker: David Duvenaud, Harvard University
Location: Warren Weaver Hall 1302
Date: February 22, 2016, 11 a.m.
Host: Subhash Khot
Synopsis:
Much recent success in machine learning has come from using gradients to optimize simple feedforward procedures such as neural networks. Surprisingly, many more complex procedures, such as message passing, filtering, inference, and even optimization itself, can be meaningfully differentiated through as well. Composing these procedures lets us build sophisticated models that generalize existing methods while retaining their good properties. We'll show applications to chemical design, gradient-based tuning of optimization procedures, and training procedures that don't require cross-validation.
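As a concrete illustration of what "differentiating through optimization itself" can mean, here is a minimal sketch in JAX (not code from the talk; the toy objectives and function names are illustrative). It unrolls a few steps of gradient descent on a training loss, then takes the gradient of the resulting validation loss with respect to the learning rate, so the optimizer's own hyperparameter can be tuned by gradient descent:

```python
import jax
import jax.numpy as jnp

def train_loss(w):
    return jnp.sum((w - 3.0) ** 2)   # toy training objective

def val_loss(w):
    return jnp.sum((w - 2.5) ** 2)   # toy validation objective

def inner_sgd(lr, w0, steps=10):
    # Unrolled gradient descent: each step is differentiable, so the
    # whole optimization procedure is just another composed function.
    w = w0
    g = jax.grad(train_loss)
    for _ in range(steps):
        w = w - lr * g(w)
    return w

def outer_objective(lr):
    w_final = inner_sgd(lr, jnp.array([0.0]))
    return val_loss(w_final)

# Gradient of the validation loss with respect to the learning rate:
print(jax.grad(outer_objective)(0.1))
```

Because the unrolled optimizer is an ordinary differentiable computation, the same composition trick extends to the other procedures named in the abstract, such as message passing and filtering.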
Speaker Bio:
David Duvenaud is a postdoc in the Harvard Intelligent Probabilistic Systems group, working with Prof. Ryan Adams on model-based optimization, synthetic chemistry, and neural networks. He completed his Ph.D. at the University of Cambridge with Carl Rasmussen and Zoubin Ghahramani. Before that, he worked on machine vision with Kevin Murphy at the University of British Columbia, and later at Google Research. David also co-founded Invenia, an energy forecasting and trading firm.
Notes:
In-person attendance is available only to those with active NYU ID cards.