Poster in Workshop: NeurIPS 2023 Workshop on Diffusion Models

TADA: Timestep-Aware Data Augmentation for Diffusion Models

NaHyeon Park · Kunhee Kim · Song Park · Jung-Woo Ha · Hyunjung Shim


Abstract:

Naively applying augmentation techniques to generative models can cause a distribution shift problem, producing unintended augmented-like output samples. While this issue has been actively studied in generative adversarial networks (GANs), little attention has been paid to diffusion models despite their widespread use. In this work, we conduct the first comprehensive study of data augmentation for diffusion models, primarily investigating the relationship between distribution shifts and data augmentation. Our study reveals that distribution shifts in diffusion models originate exclusively from specific timestep intervals, rather than across all timesteps. Based on these findings, we introduce a simple yet effective data augmentation strategy that flexibly adjusts the augmentation strength depending on the timestep. Experiments demonstrate that our simple data augmentation pipeline can improve the generation quality of diffusion models, especially in data-limited settings. We expect that our data augmentation method can benefit various diffusion model designs and tasks across a wide range of applications.
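To make the idea concrete, here is a minimal sketch of a timestep-aware augmentation schedule. The specific interval, probability, and augmentation (a horizontal flip) are illustrative assumptions, not the paper's actual configuration: the point is only that augmentation strength is gated by the sampled diffusion timestep, suppressing augmentation in the interval where distribution shift is assumed to originate.

```python
import numpy as np


def aug_strength(t, shift_interval=(0, 300), p_max=0.8):
    """Hypothetical schedule: zero augmentation probability inside the
    shift-prone timestep interval, full strength p_max elsewhere.
    (shift_interval and p_max are illustrative values, not from the paper.)"""
    lo, hi = shift_interval
    return 0.0 if lo <= t < hi else p_max


def maybe_augment(x, t, rng):
    """Apply a horizontal flip to an image array (C, H, W) with a
    timestep-dependent probability."""
    if rng.random() < aug_strength(t):
        return x[..., ::-1]  # flip along the width axis
    return x


# Sketch of use inside a diffusion training step:
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 32, 32))   # one training image
t = int(rng.integers(0, 1000))         # sampled diffusion timestep
x_aug = maybe_augment(x, t, rng)
```

In a training loop, `maybe_augment` would be called on each example with its sampled timestep before computing the denoising loss, so that early (or late) timesteps receive weaker augmentation.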
