We establish a connection between stochastic optimal control and generative models based on stochastic differential equations (SDEs), such as recently developed diffusion probabilistic models. In particular, we derive a Hamilton-Jacobi-Bellman equation that governs the evolution of the log-densities of the underlying SDE marginals. This perspective allows us to transfer methods from optimal control theory to generative modeling. First, we show that the evidence lower bound is a direct consequence of the well-known verification theorem from control theory. Further, we develop a novel diffusion-based method for sampling from unnormalized densities -- a problem frequently occurring in statistics and the computational sciences.
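To illustrate the kind of equation the abstract refers to, the following is a standard computation (a sketch under the assumption of a scalar, state-independent diffusion coefficient $\sigma(t)$; the paper's setting may be more general). Starting from the Fokker-Planck equation for the marginal density $p$ of the SDE $\mathrm{d}X_t = \mu(X_t, t)\,\mathrm{d}t + \sigma(t)\,\mathrm{d}W_t$, the substitution $V = \log p$ yields a nonlinear PDE of Hamilton-Jacobi-Bellman type:

```latex
% Fokker-Planck equation for the marginal density p(x, t):
%   \partial_t p = -\nabla \cdot (\mu p) + \tfrac{\sigma^2}{2} \Delta p.
% Substituting V = \log p, and using
%   \nabla p = p \nabla V, \qquad \Delta p = p \left( \Delta V + |\nabla V|^2 \right),
% one obtains
\partial_t V
  = -\nabla \cdot \mu - \mu \cdot \nabla V
    + \frac{\sigma^2}{2} \left( \Delta V + |\nabla V|^2 \right).
```

The quadratic term $|\nabla V|^2$ is what gives the equation its HJB character and opens the door to control-theoretic tools such as the verification theorem.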
Julius Berner (University of Vienna)
Educated at the University of Vienna (BSc, MSc) with a specialization in applied mathematics and scientific computing, I became very interested in machine learning and, in particular, neural networks. I am currently working towards a doctoral degree; my research focuses on the mathematical analysis of deep-learning-based methods at the intersection of approximation theory, statistical learning theory, and optimization.
Lorenz Richter (FU Berlin)
Karen Ullrich (Meta AI)
More from the Same Authors
2020 Poster: Numerically Solving Parametric Families of High-Dimensional Kolmogorov Partial Differential Equations via Deep Learning
Julius Berner · Markus Dablander · Philipp Grohs
2019 Poster: How degenerate is the parametrization of neural networks with the ReLU activation function?
Dennis Maximilian Elbrächter · Julius Berner · Philipp Grohs