

Poster
in
Workshop: NeurIPS 2023 Workshop on Diffusion Models

Diffusion Models With Learned Adaptive Noise Processes

Subham Sahoo · Aaron Gokaslan · Christopher De Sa · Volodymyr Kuleshov


Abstract:

Diffusion models have gained traction as powerful algorithms for synthesizing high-quality images. Central to these algorithms is the diffusion process, which maps data to noise according to equations inspired by thermodynamics, and which can significantly impact performance. In this work, we explore whether a diffusion process can be learned from data. We propose multivariate learned adaptive noise (MuLAN), a learned diffusion process that applies Gaussian noise at different rates across an image. Our method consists of three components: a multivariate noise schedule, instance-conditional diffusion, and auxiliary variables. These ensure that the learning objective is no longer invariant to the choice of noise schedule, as it was in previous works. Our work is grounded in Bayesian inference and casts the learned diffusion process as an approximate variational posterior that yields a tighter lower bound on the marginal likelihood. Empirically, MuLAN significantly improves likelihood estimation on CIFAR10 and ImageNet, and achieves 2x faster convergence to state-of-the-art performance compared to classical diffusion.
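To make the core idea concrete, the sketch below shows a forward diffusion step with a per-pixel (multivariate) noise schedule, in contrast to the usual scalar schedule. This is a minimal illustration only: the per-pixel log-SNR map `gamma` here is hand-crafted, whereas MuLAN learns an instance-conditional schedule from data, and the function name and parameterization are assumptions, not the authors' code.

```python
import numpy as np

def multivariate_forward_sample(x, gamma, rng):
    """Sample z ~ q(z | x) for a diffusion step whose noise level
    varies per pixel.

    x     : (H, W) clean image
    gamma : (H, W) per-pixel log-SNR values (hypothetical hand-crafted
            parameterization; MuLAN learns these from data)
    rng   : numpy random generator
    """
    alpha_sq = 1.0 / (1.0 + np.exp(-gamma))  # sigmoid(gamma): per-pixel signal fraction
    eps = rng.standard_normal(x.shape)       # standard Gaussian noise
    return np.sqrt(alpha_sq) * x + np.sqrt(1.0 - alpha_sq) * eps

# Example: noise the center of the image faster than the border
# (a purely illustrative, fixed schedule).
H = W = 8
x = np.ones((H, W))
yy, xx = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
dist = np.sqrt((yy - H / 2) ** 2 + (xx - W / 2) ** 2)
gamma = dist - 2.0  # low log-SNR (heavier noise) near the center
z = multivariate_forward_sample(x, gamma, np.random.default_rng(0))
```

Because `gamma` is a full image-shaped array rather than a scalar, different regions of the image are destroyed at different rates as diffusion time advances, which is the degree of freedom the paper's learned schedule exploits.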
