

Poster

Asymptotics of Alpha-Divergence Variational Inference Algorithms with Exponential Families

François Bertholom · Randal Douc · François Roueff

East Exhibit Hall A-C #4105
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Recent works in Variational Inference have examined alternative criteria to the commonly used exclusive Kullback-Leibler divergence. Encouraging empirical results have been obtained with the family of alpha-divergences, but few works have focused on the asymptotic properties of the proposed algorithms, especially as the number of iterations goes to infinity. In this paper, we study a procedure that ensures a monotonic decrease in the alpha-divergence. We provide sufficient conditions guaranteeing its convergence to a local minimizer of the alpha-divergence at a geometric rate when the variational family belongs to the class of exponential models. The sample-based version of this ideal procedure involves biased gradient estimators, which hinders a direct theoretical analysis. We propose an alternative unbiased algorithm, prove its almost sure convergence to a local minimizer of the alpha-divergence, and establish a law of the iterated logarithm. Our results are illustrated with toy and real-data experiments.
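As background for the two objects the abstract combines, here is a hedged sketch of the notation. The abstract does not fix conventions, so the alpha-divergence below is written in one common parameterization (the paper's exact convention may differ); the exponential-family form is standard.

```latex
% One common convention for the alpha-divergence between the variational
% density q and the target p, valid for alpha not in {0, 1}:
\[
  D_\alpha(q \,\|\, p)
  = \frac{1}{\alpha(\alpha - 1)}
    \left( \int q(x)^{\alpha}\, p(x)^{1 - \alpha}\, \mathrm{d}x \;-\; 1 \right),
\]
% which recovers the exclusive KL divergence KL(q || p) in the limit
% alpha -> 1. An exponential model with natural parameter theta, sufficient
% statistic T, base measure h, and log-partition function A has density
\[
  q_\theta(x) = h(x)\, \exp\!\bigl( \theta^{\top} T(x) - A(\theta) \bigr).
\]
```

The following minimal, self-contained Monte Carlo sanity check of this definition uses a one-dimensional Gaussian variational family. All names and parameter values are illustrative; this is not the paper's algorithm, only an illustration of the objective being minimized.

```python
import numpy as np
from scipy.stats import norm

def alpha_divergence_mc(alpha, q_mu, q_sigma, p_mu, p_sigma, n=100_000, seed=0):
    """Monte Carlo estimate of D_alpha(q || p) for two 1-D Gaussians, under the
    convention D_alpha = (E_q[(p/q)^(1 - alpha)] - 1) / (alpha * (alpha - 1)).
    Valid for alpha not in {0, 1}."""
    rng = np.random.default_rng(seed)
    x = rng.normal(q_mu, q_sigma, size=n)  # samples from the variational density q
    # log (p(x) / q(x)), evaluated at the samples from q
    log_ratio = norm.logpdf(x, p_mu, p_sigma) - norm.logpdf(x, q_mu, q_sigma)
    # Unbiased estimate of the integral E_q[(p/q)^(1 - alpha)]
    integral = np.mean(np.exp((1.0 - alpha) * log_ratio))
    return (integral - 1.0) / (alpha * (alpha - 1.0))

# Sanity check: as alpha -> 1 the estimate should approach the exclusive
# KL divergence KL(q || p), which is 0.5 for q = N(0, 1) and p = N(1, 1).
print(alpha_divergence_mc(alpha=0.999, q_mu=0.0, q_sigma=1.0, p_mu=1.0, p_sigma=1.0))
print(0.5)  # closed-form KL(N(0,1) || N(1,1))
```

With alpha close to 1, the printed estimate should land near the closed-form value of 0.5, matching the abstract's framing of the exclusive KL divergence as the commonly used special case.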
