Stein variational gradient descent (SVGD) is a deterministic sampling algorithm that iteratively transports a set of particles to approximate a given target distribution, based on a gradient-based update constructed to optimally decrease the KL divergence within a function space. This paper develops the first theoretical analysis of SVGD. We establish that the empirical measures of the SVGD samples weakly converge to the target distribution, and show that the asymptotic behavior of SVGD is characterized by a nonlinear Fokker-Planck equation, known in physics as the Vlasov equation. We develop a geometric perspective that views SVGD as a gradient flow of the KL divergence functional under a new metric structure on the space of distributions induced by the Stein operator.
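The update rule itself is not reproduced on this page. As a concrete illustration, here is a minimal NumPy sketch of the particle update from the original SVGD paper (Liu & Wang, 2016, listed below), using an RBF kernel with the median-heuristic bandwidth and a Gaussian target chosen purely for illustration; the function names, step size, and iteration count are assumptions of this sketch, not the authors' reference code.

```python
import numpy as np

def rbf_kernel(X, h=None):
    """RBF kernel matrix and its gradient in the first argument.

    X: (n, d) array of particles. Returns K with K[j, i] = k(x_j, x_i)
    and grad_K with grad_K[j, i] = d k(x_j, x_i) / d x_j.
    """
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # (n, n)
    if h is None:
        # Median heuristic for the bandwidth, as in the 2016 paper.
        h = np.median(sq_dists) / max(np.log(X.shape[0] + 1), 1e-8)
    K = np.exp(-sq_dists / h)
    # d/dx_j exp(-||x_j - x_i||^2 / h) = -(2/h) * k(x_j, x_i) * (x_j - x_i)
    grad_K = -2.0 / h * K[:, :, None] * (X[:, None, :] - X[None, :, :])
    return K, grad_K

def svgd_step(X, score, step_size=0.1):
    """One SVGD update:
    x_i += eps * (1/n) * sum_j [k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i)]
    where score(x) = grad_x log p(x) for the target density p.
    """
    n = X.shape[0]
    K, grad_K = rbf_kernel(X)
    phi = (K @ score(X) + grad_K.sum(axis=0)) / n  # (n, d) optimal perturbation
    return X + step_size * phi

# Illustrative target: N(mu, I), whose score is mu - x (hypothetical example).
mu = np.array([2.0, -1.0])
score = lambda X: mu - X
X = np.random.randn(50, 2)      # initial particles
for _ in range(500):
    X = svgd_step(X, score)
print(X.mean(axis=0))           # should land close to mu
```

The two terms in `phi` reflect the structure discussed in the paper: the kernel-weighted score drives particles toward high-probability regions, while the kernel gradient acts as a repulsive force that keeps the particles spread out rather than collapsing to a mode.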
Author Information
Qiang Liu (Dartmouth College)
More from the Same Authors
- 2016 Poster: Learning Infinite RBMs with Frank-Wolfe
  Wei Ping · Qiang Liu · Alexander Ihler
- 2016 Poster: Bootstrap Model Aggregation for Distributed Statistical Learning
  Jun Han · Qiang Liu
- 2016 Poster: Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm
  Qiang Liu · Dilin Wang