Modelers use automatic differentiation (AD) of computation graphs to implement complex Deep Learning models without manually defining gradient computations. Stochastic AD extends AD to stochastic computation graphs with sampling steps, which arise when modelers handle the intractable expectations common in Reinforcement Learning and Variational Inference. However, current methods for stochastic AD are limited: they are either only applicable to continuous random variables and differentiable functions, or can only use simple but high-variance score-function estimators. To overcome these limitations, we introduce Storchastic, a new framework for AD of stochastic computation graphs. Storchastic allows the modeler to choose from a wide variety of gradient estimation methods at each sampling step, to optimally reduce the variance of the gradient estimates. Furthermore, Storchastic is provably unbiased for estimation of any-order gradients, and generalizes variance reduction techniques to higher-order gradient estimates. Finally, we implement Storchastic as a PyTorch library at github.com/HEmile/storchastic.
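The abstract contrasts simple score-function estimators with lower-variance alternatives; the sketch below shows what that baseline estimator looks like in plain PyTorch. It is an illustration only, not the Storchastic API: the cost function `f` and the categorical parameterization are assumptions made for the example.

```python
import torch

# Minimal sketch of the score-function (REINFORCE) estimator, assuming a
# categorical sampling step; this is plain PyTorch, not the Storchastic API.
logits = torch.zeros(4, requires_grad=True)   # distribution parameters
f = lambda z: (z.float() - 2.0) ** 2          # black-box cost, not differentiable in z

dist = torch.distributions.Categorical(logits=logits)
z = dist.sample((1000,))                      # sampling step: gradients cannot
                                              # flow through .sample() directly

# The surrogate loss f(z) * log p(z) has, in expectation, the same gradient
# w.r.t. the logits as E[f(z)], so backpropagating through it yields an
# unbiased but high-variance gradient estimate.
surrogate = (f(z).detach() * dist.log_prob(z)).mean()
surrogate.backward()
print(logits.grad)                            # Monte Carlo gradient estimate
```

Storchastic's contribution is letting the modeler swap this kind of estimator for a lower-variance gradient estimation method at each sampling step, while keeping the overall any-order gradient estimate unbiased.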
Author Information
Emile van Krieken (Vrije Universiteit Amsterdam)
Jakub Tomczak (Vrije Universiteit Amsterdam)
Annette Ten Teije (Vrije Universiteit Amsterdam)
More from the Same Authors
- 2021: Semi-supervised Multiple Instance Learning using Variational Auto-Encoders
  Ali Nihat Uzunalioglu · Tameem Adel · Jakub M. Tomczak
- 2022: Kendall Shape-VAE: Learning Shapes in a Generative Framework
  Sharvaree Vadgama · Jakub Tomczak · Erik Bekkers
- 2022 Spotlight: Alleviating Adversarial Attacks on Variational Autoencoders with MCMC
  Anna Kuzina · Max Welling · Jakub Tomczak
- 2022 Poster: Alleviating Adversarial Attacks on Variational Autoencoders with MCMC
  Anna Kuzina · Max Welling · Jakub Tomczak
- 2022 Poster: On Analyzing Generative and Denoising Capabilities of Diffusion-based Deep Generative Models
  Kamil Deja · Anna Kuzina · Tomasz Trzcinski · Jakub Tomczak
- 2021 Poster: Invertible DenseNets with Concatenated LipSwish
  Yura Perugachi-Diaz · Jakub Tomczak · Sandjai Bhulai
- 2020 Poster: The Convolution Exponential and Generalized Sylvester Flows
  Emiel Hoogeboom · Victor Garcia Satorras · Jakub Tomczak · Max Welling
- 2019 Poster: Combinatorial Bayesian Optimization using the Graph Cartesian Product
  Changyong Oh · Jakub Tomczak · Stratis Gavves · Max Welling