Poster
Near-Optimality of Contrastive Divergence Algorithms
Pierre Glaser · Kevin Han Huang · Arthur Gretton
West Ballroom A-D #5609
Abstract:
We provide a non-asymptotic analysis of the contrastive divergence (CD) algorithm, a training method for unnormalized models. While prior work has established that (for exponential family distributions) the CD iterates asymptotically converge at an $O(n^{-1/3})$ rate to the true parameter of the data distribution, we show that CD can achieve the parametric rate $O(n^{-1/2})$. Our analysis provides results for various data batching schemes, including fully online and minibatch. We additionally show that CD is near-optimal, in the sense that its asymptotic variance is close to the Cramér-Rao lower bound.
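For readers unfamiliar with the algorithm being analyzed, below is a minimal illustrative sketch of a minibatch CD-1 update for a toy one-dimensional exponential family (a Gaussian with unknown mean and unit variance). It is not the paper's method or experimental setup; the model, step size, and batching choices are assumptions made purely for illustration. The update follows the standard CD recipe: compare the sufficient statistic on the data against the same statistic on negative samples obtained by a short MCMC run initialized at the data.

import numpy as np

# Toy exponential family p_theta(x) ∝ exp(theta * x - x^2 / 2), i.e. N(theta, 1),
# with sufficient statistic T(x) = x. All names and constants here are
# illustrative assumptions, not taken from the paper.

rng = np.random.default_rng(0)

def mcmc_step(x, theta, step=1.0):
    """One Metropolis random-walk step targeting p_theta, started at x."""
    prop = x + step * rng.standard_normal(x.shape)
    log_ratio = (theta * prop - prop**2 / 2) - (theta * x - x**2 / 2)
    accept = np.log(rng.random(x.shape)) < log_ratio
    return np.where(accept, prop, x)

def cd_minibatch(data, theta0=0.0, lr=0.1, batch=32, epochs=20):
    """Minibatch CD-1: negative samples come from one MCMC step off the data."""
    theta = theta0
    for _ in range(epochs):
        for i in range(0, len(data), batch):
            x = data[i:i + batch]
            x_neg = mcmc_step(x, theta)          # short-run chain from the data
            grad = np.mean(x) - np.mean(x_neg)   # T(data) - T(negative samples)
            theta += lr * grad
    return theta

data = rng.normal(1.5, 1.0, size=2000)   # true parameter theta* = 1.5
print(cd_minibatch(data))                 # should land roughly near 1.5

The fully online scheme analyzed in the paper corresponds to the same update applied one observation at a time (batch size 1) with a decreasing step size; the minibatch version above is just one of the batching schemes covered by the analysis.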