Directions in ML: Taking Advantage of Randomness in Expensive Optimization Problems

Optimization is at the heart of machine learning, and gradient computation is central to many optimization techniques. Stochastic optimization, in particular, has taken center stage as the principal method of fitting many models, from deep neural networks to variational Bayesian posterior approximations. Generally, one uses data subsampling to efficiently construct unbiased gradient estimators for stochastic optimization, but this is only one possibility. In this talk, I discuss two alternative approaches to constructing unbiased gradient estimates in machine learning problems. The first approach uses randomized truncation of objective functions defined as loops or limits. Such objectives arise in settings ranging from hyperparameter selection, to fitting parameters of differential equations, to variational inference using lower bounds on the log-marginal likelihood. The second approach revisits the Jacobian accumulation problem at the heart of automatic differentiation, observing that it is possible to collapse the linearized computational graph of, e.g., deep neural networks, in a randomized way such that less memory is used but little performance is lost. These projects are joint work with students Alex Beatson, Deniz Oktay, Joshua Aduol, and Nick McGreivy.
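The first approach mentioned above, randomized truncation of a loop- or limit-defined objective, can be sketched in a minimal "Russian roulette" style estimator. This is an illustrative toy, not the estimator from the talk: `partial(n)` is an assumed function returning the n-th partial evaluation of the objective, the limit is rewritten as a telescoping sum, and each retained term is reweighted by its survival probability so the randomly truncated sum remains unbiased.

```python
import random

def randomized_telescope(partial, p=0.6, max_terms=50):
    """One-sample Russian-roulette estimate of lim_{n->inf} partial(n).

    Rewrites the limit as the telescoping sum
        L = partial(0) + sum_n (partial(n) - partial(n-1)),
    truncates at a random level N ~ Geometric(1 - p), and divides each
    surviving difference by P(N >= n) = p**n so the estimate is unbiased
    (up to the negligible hard cap at max_terms).
    """
    estimate = partial(0)
    survival = 1.0  # running value of P(N >= n)
    n = 0
    while n < max_terms and random.random() < p:
        n += 1
        survival *= p
        estimate += (partial(n) - partial(n - 1)) / survival
    return estimate

# Toy usage: partial sums converging to 2. Any single sample is noisy,
# but the average over many samples recovers the limit.
vals = [randomized_telescope(lambda n: 2.0 - 2.0 ** -n) for _ in range(10000)]
print(sum(vals) / len(vals))  # close to 2.0
```

In practice the same trick is applied to gradients of such objectives, with the decay of `partial(n) - partial(n - 1)` traded off against the truncation distribution to control variance and expected cost.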

Learn more about the 2020-2021 Directions in ML: AutoML and Automating Algorithms virtual speaker series: https://aka.ms/diml

Speaker Details

Ryan Adams is interested in machine learning, artificial intelligence, and computational statistics, with applications across science and engineering. He has broad interests but often works on probabilistic methods and approximate Bayesian inference. He is the director of the Undergraduate Certificate in Statistics and Machine Learning at Princeton. He co-founded Whetlab (acquired by Twitter in 2015) and formerly co-hosted the Talking Machines podcast. He was on the faculty at Harvard from 2011 to 2016 and worked at Twitter and then Google Brain before joining the faculty at Princeton in 2018. He calls his group the Laboratory for Intelligent Probabilistic Systems (LIPS).

Speakers:
Ryan Adams
Affiliation:
Princeton University