

Poster

A Non-Asymptotic Analysis for Stein Variational Gradient Descent

Anna Korba · Adil Salim · Michael Arbel · Giulia Luise · Arthur Gretton

Poster Session 5 #1497

Abstract: We study the Stein Variational Gradient Descent (SVGD) algorithm, which optimises a set of particles to approximate a target probability distribution $\pi\propto e^{-V}$ on $\mathbb{R}^d$. In the population limit, SVGD performs gradient descent on the KL divergence with respect to $\pi$ in the space of probability distributions, where the gradient is smoothed through a kernel integral operator. In this paper, we provide a novel finite-time analysis of the SVGD algorithm: a descent lemma establishing that the algorithm decreases the objective at each iteration, and rates of convergence for the averaged Stein Fisher divergence (also referred to as the Kernel Stein Discrepancy). We also prove that the finite-particle system corresponding to the practical implementation of SVGD converges to its population version.
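
To make the update rule concrete, here is a minimal sketch of the SVGD iteration the abstract describes: each particle moves along a kernel-smoothed estimate of $\nabla\log\pi = -\nabla V$ plus a repulsive term coming from the kernel gradient. The RBF kernel, bandwidth, step size, and Gaussian target below are illustrative assumptions for the sketch, not choices made in the paper.

```python
import numpy as np

def rbf_kernel_and_grad(x, h=1.0):
    # x: (n, d) particles. Returns K (n, n) and grad_K (n, n, d),
    # where grad_K[j, i] = grad_{x_j} k(x_j, x_i) for an RBF kernel.
    diff = x[:, None, :] - x[None, :, :]           # diff[j, i] = x_j - x_i
    sq_dists = np.sum(diff ** 2, axis=-1)          # (n, n)
    K = np.exp(-sq_dists / (2 * h ** 2))
    grad_K = -diff / h ** 2 * K[:, :, None]        # grad_{x_j} k(x_j, x_i)
    return K, grad_K

def svgd_step(x, grad_log_pi, step_size=0.1, h=1.0):
    # One SVGD iteration: a driving term pulling particles toward high
    # density of pi, plus a repulsive term that keeps them spread out.
    n = x.shape[0]
    K, grad_K = rbf_kernel_and_grad(x, h)
    score = grad_log_pi(x)                          # (n, d), = -grad V
    phi = (K.T @ score + grad_K.sum(axis=0)) / n    # update direction
    return x + step_size * phi

# Illustrative target: standard Gaussian pi ~ exp(-||x||^2 / 2),
# so grad log pi(x) = -x. Particles start far from the target.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 2)) + 5.0
for _ in range(500):
    x = svgd_step(x, grad_log_pi=lambda x: -x)
print(x.mean(axis=0), x.std(axis=0))                # roughly [0, 0] and [1, 1]
```

This is the finite-particle system analysed in the paper; in the population limit (number of particles going to infinity), the empirical average over particles becomes an expectation under the current distribution, which is the setting of the descent lemma.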
