

Poster

Energy-Inspired Models: Learning with Sampler-Induced Distributions

Dieterich Lawson · George Tucker · Bo Dai · Rajesh Ranganath

East Exhibition Hall B, C #120

Keywords: [ Generative Models ] [ Deep Learning ] [ Probabilistic Methods ] [ Variational Inference ]


Abstract:

Energy-based models (EBMs) are powerful probabilistic models, but suffer from intractable sampling and density evaluation due to the partition function. As a result, inference in EBMs relies on approximate sampling algorithms, leading to a mismatch between the model and inference. Motivated by this, we consider the sampler-induced distribution as the model of interest and maximize the likelihood of this model. This yields a class of energy-inspired models (EIMs) that incorporate learned energy functions while still providing exact samples and tractable log-likelihood lower bounds. We describe and evaluate three instantiations of such models based on truncated rejection sampling, self-normalized importance sampling, and Hamiltonian importance sampling. These models outperform or perform comparably to the recently proposed Learned Accept/Reject Sampling algorithm and provide new insights on ranking Noise Contrastive Estimation and Contrastive Predictive Coding. Moreover, EIMs allow us to generalize a recent connection between multi-sample variational lower bounds and auxiliary variable variational inference. We show how recent variational bounds can be unified with EIMs as the variational family.
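To make the "sampler-induced distribution" concrete, the following is a minimal sketch of one of the three samplers mentioned above, self-normalized importance sampling (SNIS): draw K candidates from a proposal, then resample one candidate with probability proportional to the exponentiated learned energy. The proposal, the energy function, and K below are illustrative placeholders, not the paper's implementation or training objective.

```python
import numpy as np

rng = np.random.default_rng(0)

def propose(k):
    # Stand-in proposal distribution: K draws from a standard normal.
    return rng.standard_normal(k)

def energy(x):
    # Stand-in learned energy U(x); here it tilts samples toward x = 2.
    return -0.5 * (x - 2.0) ** 2

def snis_sample(k=16):
    """Draw one exact sample from the SNIS-induced distribution:
    sample K candidates from the proposal, then resample one of them
    with probability proportional to exp(U(x))."""
    xs = propose(k)
    logw = energy(xs)
    w = np.exp(logw - logw.max())   # stabilized unnormalized weights
    probs = w / w.sum()             # self-normalization step
    idx = rng.choice(k, p=probs)
    return xs[idx]

samples = np.array([snis_sample() for _ in range(1000)])
print(samples.mean())  # shifted toward 2 relative to the N(0, 1) proposal
```

In an EIM, this resampling procedure itself is treated as the generative model: it yields exact samples by construction, and the paper derives tractable lower bounds on its log-likelihood for training.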
