Poster
Advances in Black-Box VI: Normalizing Flows, Importance Weighting, and Optimization
Abhinav Agrawal · Daniel Sheldon · Justin Domke

Thu Dec 10 09:00 AM -- 11:00 AM (PST) @ Poster Session 5 #1478

Recent research has seen several advances relevant to black-box VI, but the current state of automatic posterior inference is unclear. One such advance is the use of normalizing flows to define flexible posterior densities for deep latent variable models. Another direction is the integration of Monte Carlo methods to serve two purposes: first, to obtain tighter variational objectives for optimization, and second, to define enriched variational families through sampling. However, both flows and variational Monte Carlo methods remain relatively unexplored for black-box VI. Moreover, on a pragmatic front, there are several optimization considerations, such as the step-size scheme, parameter initialization, and choice of gradient estimator, for which there is no clear guidance in the existing literature. In this paper, we postulate that black-box VI is best addressed through a careful combination of numerous algorithmic components. We evaluate components relating to optimization, flows, and Monte Carlo methods on a benchmark of 30 models from the Stan model library. The combination of these algorithmic components significantly advances the state of the art in "out of the box" variational inference.
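
The "tighter variational objectives" mentioned above refer to importance-weighted bounds: drawing K samples from the approximate posterior q and averaging the importance weights p(x, z) / q(z) inside the logarithm gives a bound on log p(x) that is at least as tight as the standard ELBO. Below is a minimal sketch of such an estimator for a toy one-dimensional Gaussian model; the model, parameter values, and function name are illustrative assumptions, not the paper's implementation.

```python
# Sketch of a K-sample importance-weighted ELBO estimate for a toy model
# (illustrative only; not the code or models used in the paper).
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

def iw_elbo(x, q_mu, q_sigma, prior_sigma=1.0, lik_sigma=1.0, K=32, rng=None):
    """One Monte Carlo estimate of the K-sample importance-weighted ELBO."""
    rng = np.random.default_rng() if rng is None else rng
    z = rng.normal(q_mu, q_sigma, size=K)                    # z_k ~ q(z)
    log_p = (norm.logpdf(z, loc=0.0, scale=prior_sigma)      # log p(z_k)
             + norm.logpdf(x, loc=z, scale=lik_sigma))       # log p(x | z_k)
    log_q = norm.logpdf(z, loc=q_mu, scale=q_sigma)          # log q(z_k)
    # log (1/K) sum_k p(x, z_k) / q(z_k); recovers the standard ELBO when K = 1
    return logsumexp(log_p - log_q) - np.log(K)

x_obs = 1.5  # a single observed data point (illustrative)
print("K=1 :", iw_elbo(x_obs, q_mu=0.5, q_sigma=0.8, K=1))
print("K=32:", iw_elbo(x_obs, q_mu=0.5, q_sigma=0.8, K=32))
```

In expectation, increasing K tightens this bound toward log p(x). The black-box setting studied in the paper additionally requires gradient estimates of such objectives (e.g., via reparameterization), which this sketch omits.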

Author Information

Abhinav Agrawal (UMass Amherst)

Ph.D. student working to scale probabilistic inference.

Daniel Sheldon (University of Massachusetts Amherst)
Justin Domke (University of Massachusetts Amherst)
