Poster
Implicit Reparameterization Gradients
Mikhail Figurnov · Shakir Mohamed · Andriy Mnih

Thu Dec 06 07:45 AM -- 09:45 AM (PST) @ Room 210 #33

By providing a simple and efficient way of computing low-variance gradients of continuous random variables, the reparameterization trick has become the technique of choice for training a variety of latent variable models. However, it is not applicable to a number of important continuous distributions. We introduce an alternative approach to computing reparameterization gradients based on implicit differentiation and demonstrate its broader applicability by applying it to Gamma, Beta, Dirichlet, and von Mises distributions, which cannot be used with the classic reparameterization trick. Our experiments show that the proposed approach is faster and more accurate than the existing gradient estimators for these distributions.
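The following is a minimal sketch (not the authors' implementation) of the core identity behind implicit reparameterization for a univariate distribution: if F(z; θ) is the CDF of the sample z, then dz/dθ = -(∂F/∂θ) / p(z; θ). The example below applies this to a Gamma(alpha, 1) sample using SciPy; the central finite difference for ∂F/∂alpha is a simplification for illustration only, whereas the paper differentiates the standardization function exactly.

```python
# Sketch of an implicit reparameterization gradient for a Gamma sample
# (illustrative only; the finite difference is not the paper's method).
import numpy as np
from scipy import stats

def implicit_grad_gamma_sample(z, alpha, eps=1e-5):
    """dz/dalpha for z ~ Gamma(alpha, 1), via dz/dalpha = -(dF/dalpha) / p(z; alpha)."""
    # Central finite difference of the CDF with respect to the shape parameter.
    dF_dalpha = (stats.gamma.cdf(z, alpha + eps) -
                 stats.gamma.cdf(z, alpha - eps)) / (2 * eps)
    pdf = stats.gamma.pdf(z, alpha)
    return -dF_dalpha / pdf

alpha = 2.5
z = stats.gamma.rvs(alpha, random_state=0)  # draw one sample
print(implicit_grad_gamma_sample(z, alpha))
```

Because the gradient is obtained by differentiating the CDF rather than inverting it, the same recipe extends to distributions such as Gamma, Beta, Dirichlet, and von Mises, whose inverse CDFs have no convenient closed form.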

Author Information

Mikhail Figurnov (DeepMind)
Shakir Mohamed (DeepMind)

Shakir Mohamed is a senior staff scientist at DeepMind in London. Shakir's main interests lie at the intersection of approximate Bayesian inference, deep learning, and reinforcement learning, and in the role that machine learning systems at this intersection play in the development of more intelligent and general-purpose learning systems. Before moving to London, Shakir held a Junior Research Fellowship from the Canadian Institute for Advanced Research (CIFAR), based at the University of British Columbia in Vancouver with Nando de Freitas. Shakir completed his PhD with Zoubin Ghahramani at the University of Cambridge, where he was a Commonwealth Scholar to the United Kingdom. Shakir is from South Africa and completed his earlier degrees in Electrical and Information Engineering at the University of the Witwatersrand, Johannesburg.

Andriy Mnih (DeepMind)
