Poster
Convergence for score-based generative modeling with polynomial complexity
Holden Lee · Jianfeng Lu · Yixin Tan
Score-based generative modeling (SGM) is a highly successful approach for learning a probability distribution from data and generating further samples. We prove the first polynomial convergence guarantees for the core mechanic behind SGM: drawing samples from a probability density $p$ given a score estimate (an estimate of $\nabla \ln p$) that is accurate in $L^2(p)$. Compared to previous works, we do not incur error that grows exponentially in time or that suffers from a curse of dimensionality. Our guarantee works for any smooth distribution and depends polynomially on its log-Sobolev constant. Using our guarantee, we give a theoretical analysis of score-based generative modeling, which transforms white-noise input into samples from a learned data distribution given score estimates at different noise scales. Our analysis gives theoretical grounding to the observation that an annealed procedure is required in practice to generate good samples, as our proof depends essentially on using annealing to obtain a warm start at each step. Moreover, we show that a predictor-corrector algorithm gives better convergence than using either portion alone.
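For intuition, the annealed sampling procedure the abstract analyzes can be sketched in code. The following is a minimal, hypothetical NumPy illustration of a predictor-corrector sampler, not the paper's exact algorithm: `score_fn(x, sigma)` stands in for an assumed estimate of $\nabla \ln p_\sigma$ at noise scale $\sigma$, and the schedule, step sizes, and toy Gaussian score are made up for the example.

    import numpy as np

    def pc_sampler(score_fn, dim, sigmas, n_corrector=1, corrector_step=1e-3, rng=None):
        # Illustrative predictor-corrector sampler (hypothetical sketch, not the
        # paper's exact algorithm).  score_fn(x, sigma) is an assumed estimate of
        # grad log p_sigma(x); sigmas is a decreasing annealing schedule.
        rng = np.random.default_rng() if rng is None else rng
        x = sigmas[0] * rng.standard_normal(dim)  # white-noise initialization
        for i in range(len(sigmas) - 1):
            sigma, sigma_next = sigmas[i], sigmas[i + 1]
            step = sigma**2 - sigma_next**2  # > 0 since the schedule is decreasing
            # Predictor: one reverse-diffusion (denoising) step from sigma to sigma_next.
            x = x + step * score_fn(x, sigma) + np.sqrt(step) * rng.standard_normal(dim)
            # Corrector: Langevin dynamics targeting the smoothed density p_{sigma_next},
            # starting from the warm start produced by the predictor.
            for _ in range(n_corrector):
                x = (x + corrector_step * score_fn(x, sigma_next)
                     + np.sqrt(2.0 * corrector_step) * rng.standard_normal(dim))
        return x

    if __name__ == "__main__":
        # Toy self-check: for data ~ N(0, I), the smoothed score is exactly
        # -x / (1 + sigma^2), so samples should look standard normal.
        gaussian_score = lambda x, sigma: -x / (1.0 + sigma**2)
        sigmas = np.geomspace(10.0, 0.01, num=200)
        print(pc_sampler(gaussian_score, dim=2, sigmas=sigmas, n_corrector=2))

In this sketch, annealing from the largest noise scale downward supplies a warm start at each scale, mirroring the role annealing plays in the paper's proof, and the corrector's Langevin steps refine the predictor's output at each scale, mirroring the abstract's point that the combination converges better than either portion alone.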
Author Information
Holden Lee (Princeton University)
Jianfeng Lu (Duke University)
Yixin Tan (Duke University)
More from the Same Authors
- 2021 Spotlight: On the Representation of Solutions to Elliptic PDEs in Barron Spaces »
  Ziang Chen · Jianfeng Lu · Yulong Lu
- 2022: Convergence of score-based generative modeling for general data distributions »
  Holden Lee · Jianfeng Lu · Yixin Tan
- 2022 Panel: Panel 5B-1: Convergence for score-based… & Learning (Very) Simple… »
  Sitan Chen · Yixin Tan
- 2021: Statistical Numerical PDE: Fast Rate, Neural Scaling Law and When it’s Optimal »
  Yiping Lu · Haoxuan Chen · Jianfeng Lu · Lexing Ying · Jose Blanchet
- 2021 Poster: On the Representation of Solutions to Elliptic PDEs in Barron Spaces »
  Ziang Chen · Jianfeng Lu · Yulong Lu
- 2020 Poster: A Universal Approximation Theorem of Deep Neural Networks for Expressing Probability Distributions »
  Yulong Lu · Jianfeng Lu
- 2019 Poster: Online sampling from log-concave distributions »
  Holden Lee · Oren Mangoubi · Nisheeth Vishnoi