
Fast Black-box Variational Inference through Stochastic Trust-Region Optimization
Jeffrey Regier · Michael Jordan · Jon McAuliffe

Wed Dec 06 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #182

We introduce TrustVI, a fast second-order algorithm for black-box variational inference based on trust-region optimization and the reparameterization trick. At each iteration, TrustVI proposes and assesses a step based on minibatches of draws from the variational distribution. The algorithm provably converges to a stationary point. We implemented TrustVI in the Stan framework and compared it to two alternatives: Automatic Differentiation Variational Inference (ADVI) and Hessian-free Stochastic Gradient Variational Inference (HFSGVI). The former is based on stochastic first-order optimization. The latter uses second-order information, but lacks convergence guarantees. TrustVI typically converged at least one order of magnitude faster than ADVI, demonstrating the value of stochastic second-order information. TrustVI often found substantially better variational distributions than HFSGVI, demonstrating that our convergence theory can matter in practice.
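The core loop described above — propose a step from minibatch draws, assess it, then accept or reject while adapting the trust region — can be sketched on a toy problem. This is not the authors' Stan implementation: TrustVI's actual subproblem uses stochastic second-order information and its acceptance test is designed to preserve convergence guarantees, whereas the sketch below uses a crude first-order model. All names, constants, and the toy target are illustrative assumptions.

```python
import numpy as np

# Toy setting (illustrative, not from the paper): one variational parameter
# mu for q(z) = N(mu, 1) approximating the target p(z) = N(TARGET_MEAN, 1).
rng = np.random.default_rng(0)
TARGET_MEAN = 3.0

def elbo_estimate(mu, n_draws=256):
    # Reparameterization trick: z = mu + eps with eps ~ N(0, 1), so the
    # objective is estimated from a minibatch of draws from q.
    eps = rng.standard_normal(n_draws)
    z = mu + eps
    # With unit variances, the ELBO reduces (up to an additive constant)
    # to -0.5 * E[(z - TARGET_MEAN)^2].
    return -0.5 * np.mean((z - TARGET_MEAN) ** 2)

def elbo_grad_estimate(mu, n_draws=256):
    # Reparameterization (pathwise) gradient estimate of the toy ELBO.
    eps = rng.standard_normal(n_draws)
    return np.mean(TARGET_MEAN - (mu + eps))

def trust_region_step(mu, radius):
    # Propose: follow the stochastic gradient, clipped to the trust radius
    # (a crude stand-in for solving a second-order trust-region subproblem).
    g = elbo_grad_estimate(mu)
    step = np.clip(g, -radius, radius)
    predicted = g * step  # predicted improvement under the linear model
    actual = elbo_estimate(mu + step) - elbo_estimate(mu)
    if predicted > 0 and actual / predicted > 0.1:
        # Accept the step and expand the trust region (capped).
        return mu + step, min(2.0 * radius, 10.0)
    # Reject the step and shrink the trust region.
    return mu, 0.5 * radius

mu, radius = 0.0, 1.0
for _ in range(50):
    mu, radius = trust_region_step(mu, radius)
```

On this toy problem the iterate `mu` approaches the target mean of 3.0; rejected steps only shrink the radius, which is what makes the noisy acceptance test safe.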

Author Information

Jeff Regier (UC Berkeley)
Michael Jordan (UC Berkeley)
Jon McAuliffe (UC Berkeley)
