

Poster

SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

Sinho Chewi · Thibaut Le Gouic · Chen Lu · Tyler Maunu · Philippe Rigollet

Poster Session 5 #1498

Abstract:

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described as the kernelized gradient flow of the Kullback-Leibler divergence in the geometry of optimal transport. We introduce a new perspective that instead views SVGD as the kernelized gradient flow of the chi-squared divergence. Motivated by this perspective, we provide a convergence analysis of the chi-squared gradient flow. We also show that this perspective yields better guidelines for choosing effective kernels for SVGD.
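For readers unfamiliar with the algorithm referenced in the abstract, the following is a minimal sketch of the standard SVGD particle update (Liu & Wang, 2016) on a toy one-dimensional Gaussian target. The RBF kernel, bandwidth, step size, and target are illustrative assumptions and are not the kernel choices studied in this paper.

```python
import numpy as np

def rbf_kernel(x, bandwidth=1.0):
    # Pairwise RBF kernel matrix and its gradient with respect to the first argument.
    diff = x[:, None] - x[None, :]               # diff[j, i] = x_j - x_i
    K = np.exp(-diff**2 / (2 * bandwidth**2))    # K[j, i] = k(x_j, x_i)
    grad_K = -diff / bandwidth**2 * K            # d/dx_j k(x_j, x_i)
    return K, grad_K

def svgd_step(x, score, step=0.1, bandwidth=1.0):
    # x: (n,) particle positions; score(x): grad log pi evaluated at each particle.
    n = len(x)
    K, grad_K = rbf_kernel(x, bandwidth)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) * score(x_j) + d/dx_j k(x_j, x_i) ]
    phi = (K * score(x)[:, None]).sum(axis=0) / n + grad_K.sum(axis=0) / n
    return x + step * phi

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    particles = rng.normal(loc=5.0, scale=0.5, size=50)  # initialized far from the target
    score = lambda x: -x                                  # grad log density of N(0, 1)
    for _ in range(500):
        particles = svgd_step(particles, score)
    print(particles.mean(), particles.std())              # should approach 0 and 1
```

The first term in the update drives particles toward high-density regions of the target, while the kernel-gradient term acts as a repulsive force that keeps the particles spread out.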
