Boosting variational inference (BVI) approximates an intractable probability density by iteratively building up a mixture of simple component distributions one at a time, using techniques from sparse convex optimization to provide both computational scalability and approximation error guarantees. But these guarantees rely on strong conditions that often do not hold in practice, resulting in degenerate component optimization problems; and we show that the ad-hoc regularization used to prevent degeneracy can cause BVI to fail in unintuitive ways. We thus develop universal boosting variational inference (UBVI), a BVI scheme that exploits the simple geometry of probability densities under the Hellinger metric to prevent the degeneracy of other gradient-based BVI methods, avoid difficult joint optimizations of both component and weight, and simplify fully-corrective weight optimizations. We show that for any target density and any mixture component family, the output of UBVI converges to the best possible approximation in the mixture family, even when the mixture family is misspecified. We develop a scalable implementation based on exponential family mixture components and standard stochastic optimization techniques. Finally, we discuss the statistical benefits of the Hellinger distance as a variational objective through bounds on posterior probability, moment, and importance sampling errors. Experiments on multiple datasets and models show that UBVI provides reliable, accurate posterior approximations.
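For intuition, the following is a minimal 1-D sketch (not the authors' implementation) of greedy boosting in the square-root-density embedding, where the Hellinger geometry mentioned in the abstract reduces to ordinary Euclidean geometry on sqrt-densities. The grid discretization, the Gaussian candidate dictionary, the simplified greedy alignment score, and the NNLS fully-corrective step are all illustrative assumptions; the paper's method instead uses exponential-family components and stochastic optimization.

```python
# Minimal 1-D sketch of greedy boosting in the square-root-density embedding.
# NOT the UBVI implementation from the paper; the dictionary, greedy score, and
# NNLS weight refit are illustrative assumptions.
import numpy as np
from scipy.optimize import nnls
from scipy.stats import norm

# Target density p (a bimodal Gaussian mixture) evaluated on a grid.
x = np.linspace(-10.0, 10.0, 2000)
dx = x[1] - x[0]
p = 0.3 * norm.pdf(x, -3.0, 1.0) + 0.7 * norm.pdf(x, 2.0, 1.5)
sqrt_p = np.sqrt(p)

# Dictionary of candidate Gaussian components (means x scales).
means = np.linspace(-8.0, 8.0, 33)
scales = [0.5, 1.0, 2.0, 4.0]
sqrt_cands = np.array([np.sqrt(norm.pdf(x, m, s)) for m in means for s in scales])

chosen = []  # indices of selected components
for it in range(5):
    # Part of sqrt(p) orthogonal to the span of the selected sqrt-components.
    if chosen:
        A = sqrt_cands[chosen].T                       # (grid points, k)
        coef, *_ = np.linalg.lstsq(A, sqrt_p, rcond=None)
        resid = sqrt_p - A @ coef
    else:
        resid = sqrt_p
    # Greedy step: candidate whose sqrt-density aligns best with the residual.
    scores = sqrt_cands @ resid * dx
    scores[chosen] = -np.inf                           # do not re-select
    chosen.append(int(np.argmax(scores)))
    # Fully-corrective step: refit all nonnegative weights in the embedding.
    A = sqrt_cands[chosen].T
    lam, _ = nnls(A, sqrt_p)
    sqrt_q = A @ lam
    sqrt_q /= np.sqrt(np.sum(sqrt_q**2) * dx)          # renormalize to a density
    hell = np.sqrt(max(0.0, 1.0 - np.sum(sqrt_q * sqrt_p) * dx))
    print(f"{it + 1} components, Hellinger distance ~ {hell:.4f}")
```

Each iteration adds the candidate whose sqrt-density best aligns with the part of sqrt(p) not yet captured, then refits all nonnegative weights at once, mirroring the fully-corrective weight optimizations described above.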
Author Information
Trevor Campbell (University of British Columbia)
Xinglong Li (University of British Columbia)
More from the Same Authors
- 2022 Poster: Bayesian inference via sparse Hamiltonian flows
  Naitong Chen · Zuheng Xu · Trevor Campbell
- 2022 Poster: Fast Bayesian Coresets via Subsampling and Quasi-Newton Refinement
  Cian Naik · Judith Rousseau · Trevor Campbell
- 2022 Poster: Parallel Tempering With a Variational Reference
  Nikola Surjanovic · Saifuddin Syed · Alexandre Bouchard-Côté · Trevor Campbell
- 2021 Workshop: Your Model is Wrong: Robustness and misspecification in probabilistic modeling
  Diana Cai · Sameer Deshpande · Michael Hughes · Tamara Broderick · Trevor Campbell · Nick Foti · Barbara Engelhardt · Sinead Williamson
- 2020 Poster: Bayesian Pseudocoresets
  Dionysis Manousakas · Zuheng Xu · Cecilia Mascolo · Trevor Campbell
- 2019 Poster: Sparse Variational Inference: Bayesian Coresets from Scratch
  Trevor Campbell · Boyan Beronov