

Poster in Workshop: NeurIPS 2023 Workshop on Diffusion Models

The Emergence of Reproducibility and Consistency in Diffusion Models

Huijie Zhang · Jinfan Zhou · Yifu Lu · Minzhe Guo · Liyue Shen · Qing Qu

Presentation: NeurIPS 2023 Workshop on Diffusion Models
Fri 15 Dec, 6:50 a.m. – 3:30 p.m. PST

Abstract:

In this work, we uncover a distinct and prevalent phenomenon in diffusion models, in contrast to most other generative models, which we refer to as "consistent model reproducibility": our extensive experiments consistently show that, when starting from the same initial noise input and sampling with a deterministic solver, diffusion models tend to produce nearly identical output content. This consistency holds regardless of the choice of model architecture and training procedure. Additionally, our research unveils that this exceptional model reproducibility manifests in two distinct training regimes: (i) the "memorization regime," characterized by a significantly overparameterized model that attains reproducibility mainly by memorizing the training data; and (ii) the "generalization regime," in which the model is trained on an extensive dataset and its reproducibility emerges alongside the model's generalization capability.
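To make the experimental setup concrete, below is a minimal sketch of the deterministic-sampling protocol the abstract describes: fix the initial noise, sample with a deterministic DDIM-style solver (eta = 0), and compare the outputs of two independently trained models. The epsilon-prediction interface model(x_t, t), the linear beta schedule, the step count, and the names model_a / model_b are illustrative assumptions, not the paper's exact configuration.

```python
import torch

@torch.no_grad()
def ddim_sample(eps_model, x_T, alphas_bar, steps):
    """Deterministic DDIM sampling (eta = 0): the same x_T always
    maps to the same output for a given eps_model."""
    x = x_T
    for i in reversed(range(1, steps)):
        a_t, a_prev = alphas_bar[i], alphas_bar[i - 1]
        t = torch.full((x.shape[0],), i, device=x.device, dtype=torch.long)
        eps = eps_model(x, t)                                 # predicted noise
        x0 = (x - (1 - a_t).sqrt() * eps) / a_t.sqrt()        # predicted clean sample
        x = a_prev.sqrt() * x0 + (1 - a_prev).sqrt() * eps    # deterministic update
    return x

# Shared initial noise and a placeholder linear-beta schedule.
steps = 50
betas = torch.linspace(1e-4, 0.02, steps)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)
x_T = torch.randn(1, 3, 32, 32, generator=torch.Generator().manual_seed(0))

# With model_a and model_b standing in for two diffusion models trained
# with different architectures or training runs, the paper's observation
# is that these two outputs are nearly identical:
# out_a = ddim_sample(model_a, x_T, alphas_bar, steps)
# out_b = ddim_sample(model_b, x_T, alphas_bar, steps)
# print(torch.nn.functional.mse_loss(out_a, out_b))
```

The determinism of the solver is what makes the comparison meaningful: with a stochastic sampler, even a single model would not map a fixed x_T to a fixed output.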
