Optimizing Conditional Input of Denoising Score Matching is Biased Towards Minimizing Variance
Tongda Xu
Abstract
Many recent works utilize denoising score matching to optimize the input or condition of diffusion models. In this workshop paper, we demonstrate that such optimization breaks the equivalence between denoising score matching and exact score matching. Furthermore, we show that the resulting bias drives the optimization toward minimizing the conditional variance. Additionally, we observe a similar bias when optimizing the data distribution using a pre-trained diffusion model. Finally, we discuss the wide range of works across different domains that are affected by this bias, including MAR \citep{li2024autoregressive} for auto-regressive generation, PerCo \citep{careil2023towards} for image compression, and DreamFusion \citep{poole2022dreamfusion} for text-to-3D generation.
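The equivalence in question is the classical denoising score matching identity of Vincent (2011). A brief sketch of why optimizing the input breaks it (notation is assumed here, not taken from the paper: $q_\sigma$ denotes the perturbation kernel and $s_\theta$ the learned score):

```latex
% Vincent (2011): with perturbation kernel q_\sigma(\tilde{x}\mid x) and
% marginal q_\sigma(\tilde{x}) = \int q_\sigma(\tilde{x}\mid x)\, p(x)\, dx,
J_{\mathrm{ESM}}(\theta)
  = \mathbb{E}_{q_\sigma(\tilde{x})}
    \tfrac{1}{2}\bigl\| s_\theta(\tilde{x})
      - \nabla_{\tilde{x}} \log q_\sigma(\tilde{x}) \bigr\|^2,
\qquad
J_{\mathrm{DSM}}(\theta)
  = \mathbb{E}_{p(x)\, q_\sigma(\tilde{x}\mid x)}
    \tfrac{1}{2}\bigl\| s_\theta(\tilde{x})
      - \nabla_{\tilde{x}} \log q_\sigma(\tilde{x}\mid x) \bigr\|^2.

% The two objectives differ only by a term that is constant in \theta:
J_{\mathrm{DSM}}(\theta) = J_{\mathrm{ESM}}(\theta) + C,
\qquad C = C(p, \sigma)\ \text{independent of}\ \theta.
```

The constant $C$ depends on the data (or condition) distribution. So when the optimization variable is the input $x$ or a condition $c$ rather than the network parameters $\theta$, $C$ is no longer constant, and minimizing the DSM objective is no longer equivalent to minimizing the exact score matching objective; per the abstract, this residual term is what biases the optimization toward minimizing the conditional variance.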