Sticking the Landing: Simple, Lower-Variance Gradient Estimators for Variational Inference
Geoffrey Roeder · Yuhuai Wu · David Duvenaud

Wed Dec 06 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #180

We propose a simple and general variant of the standard reparameterized gradient estimator for the variational evidence lower bound. Specifically, we remove a part of the total derivative with respect to the variational parameters that corresponds to the score function. Removing this term produces an unbiased gradient estimator whose variance approaches zero as the approximate posterior approaches the exact posterior. We analyze the behavior of this gradient estimator theoretically and empirically, and generalize it to more complex variational distributions such as mixtures and importance-weighted posteriors.
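The key point above can be checked numerically on a toy problem. Below is a minimal sketch (my own illustration, not the authors' code) on a 1-D Gaussian: the target is p(z) = N(0, 1), the variational family is q(z) = N(mu, sigma^2), and z is reparameterized as z = mu + sigma * eps. The "standard" estimator takes the total derivative of log p(z) - log q(z | mu, sigma) with respect to mu; the proposed variant drops the score-function term (the partial derivative of log q with respect to the variational parameters). When q equals the exact posterior, the modified estimator is identically zero per sample, while the standard one still has variance 1:

```python
import numpy as np

# Toy check of the lower-variance ("sticking the landing") estimator.
# Target p(z) = N(0, 1); variational family q(z) = N(mu, sigma^2).
# Per-sample gradient of the ELBO w.r.t. mu, with z = mu + sigma * eps:
#   standard: total derivative of [log p(z) - log q(z | mu, sigma)]
#   modified: same, but log q's parameters are held fixed, so the
#             score term d(log q)/d(mu) is removed.

rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0            # variational params set to the exact posterior
eps = rng.standard_normal(100_000)
z = mu + sigma * eps

# Derivatives through the sample z (path derivative; dz/dmu = 1):
#   d log p(z)/dz = -z,   d log q(z)/dz = -(z - mu)/sigma^2
dlogp_dz = -z
dlogq_dz = -(z - mu) / sigma**2
stl_grad = (dlogp_dz - dlogq_dz) * 1.0      # score term removed

# The standard estimator additionally subtracts the score,
# d log q / d mu = (z - mu)/sigma^2, which the path term no longer cancels.
score = (z - mu) / sigma**2
std_grad = stl_grad - score

print(f"standard estimator: mean {std_grad.mean():+.3f}, var {std_grad.var():.3f}")
print(f"modified estimator: mean {stl_grad.mean():+.3f}, var {stl_grad.var():.3f}")
```

Both estimators are unbiased (mean near zero at the optimum), but at the exact posterior the path and score contributions to the modified estimator cancel sample-by-sample, so its variance is exactly zero, matching the abstract's claim that the variance vanishes as q approaches the exact posterior.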

Author Information

Geoffrey Roeder (University of Toronto)
Yuhuai Wu (University of Toronto)
David Duvenaud (University of Toronto)

David Duvenaud is an assistant professor in computer science at the University of Toronto. His research focuses on continuous-time models, latent-variable models, and deep learning. His postdoc was done at Harvard University, and his Ph.D. at the University of Cambridge. David also co-founded Invenia, an energy forecasting and trading company.
