Poster
Variational Inference via $\chi$ Upper Bound Minimization
Adji Bousso Dieng · Dustin Tran · Rajesh Ranganath · John Paisley · David Blei

Mon Dec 04 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #186
Variational inference (VI) is widely used as an efficient alternative to Markov chain Monte Carlo. It posits a family of approximating distributions $q$ and finds the closest member to the exact posterior $p$. Closeness is usually measured via a divergence $D(q || p)$ from $q$ to $p$. While successful, this approach also has problems. Notably, it typically leads to underestimation of the posterior variance. In this paper we propose CHIVI, a black-box variational inference algorithm that minimizes $D_{\chi}(p || q)$, the $\chi$-divergence from $p$ to $q$. CHIVI minimizes an upper bound of the model evidence, which we term the $\chi$ upper bound (CUBO). Minimizing the CUBO leads to improved posterior uncertainty, and it can also be used with the classical VI lower bound (ELBO) to provide a sandwich estimate of the model evidence. We study CHIVI on three models: probit regression, Gaussian process classification, and a Cox process model of basketball plays. When compared to expectation propagation and classical VI, CHIVI produces better error rates and more accurate estimates of posterior variance.
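The sandwich estimate mentioned in the abstract can be illustrated with a small Monte Carlo sketch: the ELBO lower-bounds the log evidence, while the CUBO, $\mathrm{CUBO}_n = \frac{1}{n}\log \mathbb{E}_q[(p(x,z)/q(z))^n]$, upper-bounds it. The toy target below (a scaled Gaussian with known normalizer) and the choice of an overdispersed Gaussian $q$ are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

# Sandwich-bound sketch: ELBO <= log Z <= CUBO_n for an importance-weighted
# Monte Carlo estimate. Toy setup (not from the paper): the unnormalized
# target is p_tilde(z) = 2 * N(z; 0, 1), so the true log evidence is log 2.

rng = np.random.default_rng(0)

log_Z = np.log(2.0)  # known normalizer of the toy target

def log_p_tilde(z):
    """Unnormalized target: log of 2 * N(z; 0, 1)."""
    return log_Z - 0.5 * z**2 - 0.5 * np.log(2 * np.pi)

s = 1.2  # overdispersed approximation q = N(0, s^2), so CUBO_2 has finite variance

def log_q(z):
    return -0.5 * (z / s)**2 - np.log(s) - 0.5 * np.log(2 * np.pi)

z = s * rng.standard_normal(200_000)   # samples from q
log_w = log_p_tilde(z) - log_q(z)      # log importance weights log(p_tilde/q)

elbo = log_w.mean()                               # E_q[log w]        <= log Z
n = 2.0
cubo = np.log(np.mean(np.exp(n * log_w))) / n     # (1/n) log E_q[w^n] >= log Z

print(f"ELBO = {elbo:.4f} <= log Z = {log_Z:.4f} <= CUBO = {cubo:.4f}")
```

The two bounds bracket the true log evidence, and their gap shrinks as $q$ approaches $p$; minimizing the CUBO in $q$ (as CHIVI does, via stochastic gradients) tightens the upper side of the sandwich.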

Author Information

Adji Dieng (Columbia University)
Dustin Tran (Columbia University & OpenAI)
Rajesh Ranganath (Princeton University)

Rajesh Ranganath is a PhD candidate in computer science at Princeton University. His research interests include approximate inference, model checking, Bayesian nonparametrics, and machine learning for healthcare. Rajesh has made several advances in variational methods, especially in popularising black-box variational inference methods, which automate inference and make it easier to use while providing more scalable and accurate posterior approximations. Rajesh works in the SLAP group with David Blei. Before starting his PhD, Rajesh worked as a software engineer for AMA Capital Management. He obtained his BS and MS from Stanford University with Andrew Ng and Dan Jurafsky. Rajesh has won several awards and fellowships, including the NDSEG graduate fellowship and the Porter Ogden Jacobus Fellowship, given to the top four doctoral students at Princeton University.

John Paisley (Columbia University)
David Blei (Columbia University)

David Blei is a Professor of Statistics and Computer Science at Columbia University, and a member of the Columbia Data Science Institute. His research is in statistical machine learning, involving probabilistic topic models, Bayesian nonparametric methods, and approximate posterior inference algorithms for massive data. He works on a variety of applications, including text, images, music, social networks, user behavior, and scientific data. David has received several awards for his research, including a Sloan Fellowship (2010), Office of Naval Research Young Investigator Award (2011), Presidential Early Career Award for Scientists and Engineers (2011), Blavatnik Faculty Award (2013), and ACM-Infosys Foundation Award (2013). He is a fellow of the ACM.
