
Relaxed-Responsibility Hierarchical Discrete VAEs
Matthew Willetts · Xenia Miscouridou · Stephen J Roberts · Chris C Holmes
Event URL: https://openreview.net/forum?id=_O7HWIFPxJ3

Successfully training Variational Autoencoders (VAEs) with a hierarchy of discrete latent variables remains an area of active research. Vector-Quantised VAEs are a powerful approach to discrete VAEs, but naive hierarchical extensions can be unstable during training. Leveraging insights from classical methods of inference, we introduce Relaxed-Responsibility Vector-Quantisation, a novel way to parameterise discrete latent variables and a refinement of relaxed Vector-Quantisation that gives better performance and more stable training. This enables a novel approach to hierarchical discrete variational autoencoders with numerous layers of latent variables (here up to 32) that we train end-to-end. Among hierarchical probabilistic deep generative models with discrete latent variables trained end-to-end, we achieve state-of-the-art bits-per-dim results on various standard datasets.
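The abstract's central idea, relaxing hard vector quantisation into soft "responsibilities" over codebook entries, can be illustrated with a minimal sketch. This is not the authors' exact parameterisation, only an assumed illustration: responsibilities are taken as a softmax over negative squared distances to the codebook (as in classical mixture-model inference), and the relaxed latent is the responsibility-weighted mixture of code vectors. All names (`relaxed_vq`, `temperature`) are hypothetical.

```python
import numpy as np

def relaxed_vq(z_e, codebook, temperature=1.0):
    """Relaxed vector-quantisation sketch (illustrative, not the paper's exact method).

    z_e:      (batch, d) continuous encoder outputs
    codebook: (K, d) discrete code vectors
    Returns the relaxed quantised latents and the responsibilities.
    """
    # Squared Euclidean distance from each encoding to each codebook entry.
    d2 = ((z_e[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)  # (batch, K)
    # Responsibilities: softmax over negative distances, so nearer codes
    # receive more probability mass; temperature controls the sharpness.
    logits = -d2 / temperature
    logits -= logits.max(axis=-1, keepdims=True)  # numerical stability
    resp = np.exp(logits)
    resp /= resp.sum(axis=-1, keepdims=True)      # (batch, K), rows sum to 1
    # Relaxed quantised latent: convex combination of codebook vectors.
    z_q = resp @ codebook                         # (batch, d)
    return z_q, resp
```

As the temperature approaches zero, each responsibility vector approaches a one-hot selection of the nearest code, recovering hard vector quantisation; keeping it positive preserves smooth gradients, which is one intuition for why a relaxed scheme can train more stably in a deep hierarchy.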

Author Information

Matthew Willetts (University College London)
Xenia Miscouridou (University of Oxford)
Stephen J Roberts (University of Oxford)
Chris C Holmes (University of Oxford)
