

Poster

DVAE#: Discrete Variational Autoencoders with Relaxed Boltzmann Priors

Arash Vahdat · Evgeny Andriyash · William Macready

Room 210 #6

Keywords: [ Latent Variable Models ] [ Generative Models ] [ Deep Autoencoders ] [ Variational Inference ]


Abstract:

Boltzmann machines are powerful distributions that have been shown to be an effective prior over binary latent variables in variational autoencoders (VAEs). However, previous methods for training discrete VAEs have used the evidence lower bound and not the tighter importance-weighted bound. We propose two approaches for relaxing Boltzmann machines to continuous distributions that permit training with importance-weighted bounds. These relaxations are based on generalized overlapping transformations and the Gaussian integral trick. Experiments on the MNIST and OMNIGLOT datasets show that these relaxations outperform previous discrete VAEs with Boltzmann priors. An implementation which reproduces these results is available.
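For readers unfamiliar with it, the importance-weighted bound is the multi-sample bound of Burda et al. (2016): log p(x) >= E[log (1/K) sum_k p(x, z_k)/q(z_k | x)], which tightens the ELBO as K grows and relies on reparameterized (hence continuous) samples from q. Below is a minimal PyTorch sketch of that generic bound; the toy enc, dec, and prior objects are hypothetical placeholders, not the paper's Boltzmann-prior model.

    import torch
    from torch import distributions as D

    # Toy model: 4-dim observations, 2-dim continuous latent. All modules here
    # are hypothetical stand-ins; the paper instead replaces the Gaussian prior
    # with a relaxed Boltzmann machine.
    enc = torch.nn.Linear(4, 4)            # outputs mean and log-std of q(z | x)
    dec = torch.nn.Linear(2, 4)            # outputs mean of p(x | z)
    prior = D.Normal(torch.zeros(2), torch.ones(2))

    def iw_bound(x, k=5):
        """Importance-weighted bound: log p(x) >= E[log (1/k) sum_i w_i]."""
        h = enc(x)
        q = D.Normal(h[..., :2], h[..., 2:].exp())          # q(z | x)
        z = q.rsample((k,))                                 # [k, batch, 2], reparameterized
        log_w = (D.Normal(dec(z), 1.0).log_prob(x).sum(-1)  # log p(x | z)
                 + prior.log_prob(z).sum(-1)                # log p(z)
                 - q.log_prob(z).sum(-1))                   # log q(z | x)
        return torch.logsumexp(log_w, dim=0) - torch.log(torch.tensor(float(k)))

    x = torch.randn(8, 4)
    print(iw_bound(x).mean())   # batch-averaged bound; larger k gives a tighter bound

The need for reparameterized sampling is exactly why the binary Boltzmann latents must first be relaxed to continuous variables.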
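The Gaussian integral trick mentioned above is, at its core, the classical Hubbard-Stratonovich identity, which trades a quadratic form in binary variables for an auxiliary Gaussian. A sketch of the generic identity in LaTeX (this is the textbook statement, not the paper's exact construction):

    % For symmetric positive-definite A and any z in R^n:
    \begin{equation}
      e^{\frac{1}{2} z^\top A z}
        = (2\pi)^{-n/2}\,|A|^{-1/2}
          \int e^{-\frac{1}{2} x^\top A^{-1} x + x^\top z}\, dx .
    \end{equation}
    % Applied to a Boltzmann machine p(z) \propto exp(\frac{1}{2} z^\top A z + b^\top z),
    % this augments the model with x | z ~ N(Az, A). (A can be made positive definite
    % by a diagonal shift, since z_i^2 = z_i for binary z, which only changes the
    % linear term.)

After this augmentation the joint is linear in z, so the binary z_i are conditionally independent given x and can be summed out coordinate-wise, leaving a continuous distribution over x.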
