Adversarial Symmetric Variational Autoencoder
Yunchen Pu · Weiyao Wang · Ricardo Henao · Liqun Chen · Zhe Gan · Chunyuan Li · Lawrence Carin

Mon Dec 04 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #109

A new form of variational autoencoder (VAE) is developed, in which the joint distribution of data and codes is considered in two (symmetric) forms: (i) from observed data fed through the encoder to yield codes, and (ii) from latent codes drawn from a simple prior and propagated through the decoder to manifest data. Lower bounds are learned for the marginal log-likelihoods of observed data and latent codes. When learning with the variational bound, one seeks to minimize the symmetric Kullback-Leibler divergence between the joint density functions from (i) and (ii), while simultaneously seeking to maximize the two marginal log-likelihoods. To facilitate learning, a new form of adversarial training is developed. An extensive set of experiments is performed, in which we demonstrate state-of-the-art data reconstruction and generation on several image benchmark datasets.
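The symmetric Kullback-Leibler divergence in the objective is the sum KL(p‖q) + KL(q‖p), which vanishes exactly when the two joint densities agree. As a minimal illustration of that quantity (not the paper's actual training procedure, which estimates it adversarially over joint densities of data and codes), the sketch below evaluates it in closed form for two hypothetical univariate Gaussians:

```python
import math

def kl_gaussian(mu1, var1, mu2, var2):
    # Closed-form KL( N(mu1, var1) || N(mu2, var2) ) for univariate Gaussians
    return 0.5 * (math.log(var2 / var1)
                  + (var1 + (mu1 - mu2) ** 2) / var2
                  - 1.0)

def symmetric_kl(mu1, var1, mu2, var2):
    # Symmetric KL: KL(p||q) + KL(q||p); zero iff the two densities coincide
    return (kl_gaussian(mu1, var1, mu2, var2)
            + kl_gaussian(mu2, var2, mu1, var1))

print(symmetric_kl(0.0, 1.0, 0.0, 1.0))  # identical densities -> 0.0
print(symmetric_kl(1.0, 1.0, 0.0, 1.0))  # mismatched means -> positive
```

Unlike the one-directional KL used in the standard VAE bound, the symmetric form penalizes mismatch in both directions, which is the property the paper exploits when matching the encoder-side and decoder-side joints.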

Author Information

Yunchen Pu (Duke University)
Weiyao Wang (Duke University)
Ricardo Henao (Duke University)
Liqun Chen (Duke University)
Zhe Gan (Duke University)
Chunyuan Li (Duke University)

Chunyuan is a PhD student at Duke University, affiliated with the Department of Electrical and Computer Engineering and advised by Prof. Lawrence Carin. His recent research interests focus on scalable Bayesian methods for deep learning, including generative models and reinforcement learning, with applications to computer vision and natural language processing.

Lawrence Carin (Duke University)