Poster

Near-Optimality of Contrastive Divergence Algorithms

Pierre Glaser · Kevin Han Huang · Arthur Gretton

West Ballroom A-D #5609
Wed 11 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract: We provide a non-asymptotic analysis of the contrastive divergence (CD) algorithm, a training method for unnormalized models. While prior work has established that (for exponential family distributions) the CD iterates asymptotically converge at an $O(n^{-1 / 3})$ rate to the true parameter of the data distribution, we show that CD can achieve the parametric rate $O(n^{-1 / 2})$. Our analysis provides results for various data batching schemes, including fully online and minibatch. We additionally show that CD is near-optimal, in the sense that its asymptotic variance is close to the Cramér-Rao lower bound.
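For context, a minimal sketch of the standard CD-$k$ update for an exponential-family model $p_\theta(x) \propto \exp(\langle \theta, T(x) \rangle)$; the symbols $T$, $\gamma_t$, and the Markov kernel are illustrative notation, not taken from the paper. Given a data point $x_t$ and a sample $\tilde{x}_t$ produced by running $k$ steps of an MCMC kernel with invariant distribution $p_{\theta_t}$ started at $x_t$, the parameter is updated as
$$\theta_{t+1} = \theta_t + \gamma_t \left( T(x_t) - T(\tilde{x}_t) \right),$$
where $\gamma_t$ is a step size. The rates quoted in the abstract concern how closely the (averaged) iterates of such a recursion track the true parameter as the number of samples $n$ grows.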
