Oral in Workshop: NeurIPS 2022 Workshop on Score-Based Methods

Targeted Separation and Convergence with Kernel Discrepancies

Alessandro Barp · Carl-Johann Simon-Gabriel · Mark Girolami · Lester Mackey


Abstract: Kernel Stein discrepancies (KSDs) are maximum mean discrepancies (MMDs) that leverage the score information of distributions, and have grown central to a wide range of applications. In most settings, these MMDs are required to $(i)$ separate a target $\mathrm{P}$ from other probability measures or even $(ii)$ control weak convergence to $\mathrm{P}$. In this article we derive new sufficient and necessary conditions that substantially broaden the known conditions for KSD separation and convergence control, and develop the first KSDs known to metrize weak convergence to $\mathrm{P}$. Along the way, we highlight the implications of our results for hypothesis testing, measuring and improving sample quality, and sampling with Stein variational gradient descent.
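For context, a standard formulation of the KSD referenced here (our gloss, assuming the common Langevin Stein operator with a scalar reproducing kernel $k$ and a target density $p$ with score $s_p = \nabla \log p$; the paper's setting may allow more general choices) is
$$
\mathrm{KSD}_{k,\mathrm{P}}(\mathrm{Q}) \;=\; \sup_{\|f\|_{\mathcal{H}_k^d} \le 1} \mathbb{E}_{X \sim \mathrm{Q}}\big[\, s_p(X)^\top f(X) + \nabla \cdot f(X) \,\big],
$$
which admits the closed form
$$
\mathrm{KSD}_{k,\mathrm{P}}^2(\mathrm{Q}) \;=\; \mathbb{E}_{X,Y \sim \mathrm{Q}}\big[ k_p(X,Y) \big], \quad
k_p(x,y) = s_p(x)^\top s_p(y)\, k(x,y) + s_p(x)^\top \nabla_y k(x,y) + s_p(y)^\top \nabla_x k(x,y) + \nabla_x \cdot \nabla_y k(x,y).
$$
Separation $(i)$ asks that $\mathrm{KSD}_{k,\mathrm{P}}(\mathrm{Q}) = 0$ only when $\mathrm{Q} = \mathrm{P}$, while convergence control $(ii)$ asks that $\mathrm{KSD}_{k,\mathrm{P}}(\mathrm{Q}_n) \to 0$ imply $\mathrm{Q}_n \Rightarrow \mathrm{P}$.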
