

Poster

Gradient Estimation with Discrete Stein Operators

Jiaxin Shi · Yuhao Zhou · Jessica Hwang · Michalis Titsias · Lester Mackey

Hall J (level 1) #503

Keywords: [ VAE ] [ variance reduction ] [ Stein's method ] [ REINFORCE ] [ Gradient estimation ] [ control variates ] [ discrete latent variables ] [ score function ] [ Markov Chain ]

Award: Outstanding Paper
[ Paper ] [ Poster ] [ OpenReview ]

Abstract:

Gradient estimation (approximating the gradient of an expectation with respect to the parameters of a distribution) is central to the solution of many machine learning problems. However, when the distribution is discrete, most common gradient estimators suffer from excessive variance. To improve the quality of gradient estimation, we introduce a variance reduction technique based on Stein operators for discrete distributions. We then use this technique to build flexible control variates for the REINFORCE leave-one-out estimator. Our control variates can be adapted online to minimize variance and do not require extra evaluations of the target function. In benchmark generative modeling tasks such as training binary variational autoencoders, our gradient estimator achieves substantially lower variance than state-of-the-art estimators with the same number of function evaluations.
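For reference, the abstract builds on the REINFORCE leave-one-out (RLOO) estimator. The sketch below shows that baseline estimator for a factorized Bernoulli distribution; it does not implement the paper's Stein-operator control variates, and the function name `rloo_gradient`, its signature, and the Bernoulli parameterization are illustrative assumptions rather than the authors' code.

```python
import numpy as np

def rloo_gradient(f, logits, n_samples=4, rng=None):
    """REINFORCE leave-one-out (RLOO) gradient estimate of
    d/d(logits) E[f(b)] for b ~ factorized Bernoulli(sigmoid(logits)).

    Minimal sketch of the baseline estimator; the Stein-operator
    control variates described in the abstract are not included.
    """
    rng = np.random.default_rng() if rng is None else rng
    probs = 1.0 / (1.0 + np.exp(-logits))            # sigmoid(logits), shape (D,)
    # Draw K independent samples b_k ~ Bernoulli(probs), shape (K, D).
    b = (rng.random((n_samples, logits.size)) < probs).astype(float)
    fb = np.array([f(bk) for bk in b])               # f(b_k), shape (K,)
    # Leave-one-out baseline: mean of the other K-1 function values.
    loo_baseline = (fb.sum() - fb) / (n_samples - 1)
    # Score function of Bernoulli w.r.t. its logit: d/dlogit log p(b) = b - probs.
    score = b - probs                                # shape (K, D)
    # Average (f(b_k) - baseline_k) * score_k over the K samples.
    return ((fb - loo_baseline)[:, None] * score).mean(axis=0)

# Toy usage: gradient of E[(sum(b) - 1.5)^2] w.r.t. three Bernoulli logits.
grad = rloo_gradient(lambda b: (b.sum() - 1.5) ** 2,
                     logits=np.zeros(3), n_samples=8)
```

The leave-one-out baseline reduces variance without biasing the estimate because each sample's baseline is independent of that sample; the paper's contribution is a further, Stein-operator-based control variate layered on top of this estimator.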
