The sampling of probability distributions specified up to a normalization constant is an important problem in both machine learning and statistical mechanics. While classical stochastic sampling methods such as Markov Chain Monte Carlo (MCMC) or Langevin Dynamics (LD) can suffer from slow mixing times, there is growing interest in using normalizing flows to learn the transformation of a simple prior distribution into the given target distribution. Here we propose a generalized and combined approach to sample target densities: Stochastic Normalizing Flows (SNF), an arbitrary sequence of deterministic invertible functions and stochastic sampling blocks. We show that stochasticity overcomes the expressivity limitations of normalizing flows that result from the invertibility constraint, whereas trainable transformations between sampling steps improve the efficiency of pure MCMC/LD along the flow. By invoking ideas from non-equilibrium statistical mechanics, we derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end, and by which we can compute exact importance weights without having to marginalize out the randomness of the stochastic blocks. We illustrate the representational power, sampling efficiency, and asymptotic correctness of SNFs on several benchmarks, including applications to sampling molecular systems in equilibrium.
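The abstract describes SNFs as interleaving deterministic invertible layers (which contribute a log-Jacobian to the importance weight) with stochastic sampling blocks (which, when they satisfy detailed balance, contribute the log-ratio of the block's target density before and after the step). A minimal, self-contained sketch of that bookkeeping, not the authors' implementation: the 2D Gaussian target, the fixed affine layer, and the `metropolis_block` helper are all illustrative assumptions.

```python
import numpy as np

def log_prior(x):
    # standard normal prior, up to an additive constant
    return -0.5 * np.sum(x**2, axis=-1)

def log_target(x):
    # illustrative unnormalized target: an isotropic Gaussian centered at (2, 2)
    return -0.5 * np.sum((x - 2.0)**2, axis=-1)

def affine_layer(x, scale, shift):
    # deterministic invertible transform; its log-Jacobian enters the weight
    y = scale * x + shift
    log_det = x.shape[-1] * np.log(np.abs(scale))
    return y, log_det

def metropolis_block(x, log_p, step, rng):
    # one random-walk Metropolis step targeting exp(log_p); detailed balance
    # makes the block's weight contribution log_p(x_in) - log_p(x_out)
    lp_in = log_p(x)
    prop = x + step * rng.standard_normal(x.shape)
    lp_prop = log_p(prop)
    accept = np.log(rng.uniform(size=x.shape[0])) < lp_prop - lp_in
    x_out = np.where(accept[:, None], prop, x)
    lp_out = np.where(accept, lp_prop, lp_in)
    return x_out, lp_in - lp_out

def snf_sample(n=20000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n, 2))
    logw = -log_prior(x)                      # path weight starts at -log q0(x0)
    x, log_det = affine_layer(x, scale=1.5, shift=1.0)
    logw += log_det                           # deterministic block: +log|det J|
    for _ in range(20):                       # stochastic blocks targeting p
        x, dw = metropolis_block(x, log_target, 0.5, rng)
        logw += dw
    logw += log_target(x)                     # finish with +log p(xT)
    return x, logw

x, logw = snf_sample()
w = np.exp(logw - logw.max())
mean_est = (w[:, None] * x).sum(axis=0) / w.sum()   # should be near [2, 2]
```

Normalization constants dropped from `log_prior` and `log_target` shift every `logw` by the same amount and cancel in the self-normalized estimate, which is why the weights are exact even though both densities are only known up to a constant.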
Author Information
Hao Wu (Freie Universität Berlin)
Jonas Köhler (Freie Universität Berlin)
Frank Noe (FU Berlin)
Related Events (a corresponding poster, oral, or spotlight)
- 2020 Spotlight: Stochastic Normalizing Flows
  Thu. Dec 10th, 04:00 -- 04:10 PM, Room: Orals & Spotlights: Unsupervised/Probabilistic
More from the Same Authors
- 2022: Representation Learning on Biomolecular Structures using Equivariant Graph Attention
  Tuan Le · Frank Noe · Djork-Arné Clevert
- 2022 Poster: Unsupervised Learning of Group Invariant and Equivariant Representations
  Robin Winter · Marco Bertolini · Tuan Le · Frank Noe · Djork-Arné Clevert
- 2021: Differentiable Programming in Molecular Physics
  Frank Noe
- 2021 Poster: Smooth Normalizing Flows
  Jonas Köhler · Andreas Krämer · Frank Noe
- 2021 Poster: Permutation-Invariant Variational Autoencoder for Graph-Level Representation Learning
  Robin Winter · Frank Noe · Djork-Arné Clevert
- 2020: Invited Talk - Frank Noe: Deep Markov State Models versus Covid-19
  Frank Noe
- 2020: Frank Noé - PauliNet: Deep Neural Network Solution of the Electronic Schrödinger Equation
  Frank Noe
- 2020: Invited Talk: Frank Noe - The sampling problem in statistical mechanics and Boltzmann-Generating Flows
  Frank Noe
- 2018: Invited Talk Session 1
  Frank Noe
- 2018 Poster: Deep Generative Markov State Models
  Hao Wu · Andreas Mardt · Luca Pasquali · Frank Noe
- 2016 Poster: Spectral Learning of Dynamic Systems from Nonequilibrium Data
  Hao Wu · Frank Noe