

Poster in Workshop: Synthetic Data for Empowering ML Research

Generic and Privacy-free Synthetic Data Generation for Pretraining GANs

Kyungjune Baek · Hyunjung Shim


Abstract:

Transfer learning for GANs successfully improves low-shot generation performance. However, existing studies show that a model pretrained on a single benchmark dataset does not generalize to various target datasets. More importantly, the pretrained model can be vulnerable to copyright or privacy risks. To resolve both issues, we propose an effective and unbiased data synthesizer, namely Primitives-PS, inspired by the generic characteristics of natural images. Since Primitives-PS only considers the generic properties of natural images, the generated images are free from copyright and privacy issues. In addition, a single model pretrained on our dataset can be transferred to various target datasets. Extensive analysis demonstrates that each component of our data synthesizer is effective and provides insights into the desirable properties of a pretrained model for GAN transferability.
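The abstract only states that the synthesizer draws on generic properties of natural images; the concrete Primitives-PS procedure is not given here. Below is a minimal, hypothetical sketch of what a primitives-based image synthesizer could look like: random geometric shapes rendered on plain backgrounds, yielding copyright- and privacy-free samples for pretraining. The shape types, counts, sizes, and color choices are illustrative assumptions, not the authors' actual recipe.

```python
# Hypothetical sketch (not the authors' released code): render random
# geometric primitives on plain backgrounds as privacy-free pretraining
# images. All parameters below are illustrative assumptions.
import random

from PIL import Image, ImageDraw


def random_color():
    return tuple(random.randint(0, 255) for _ in range(3))


def synthesize_primitive_image(size=128, max_shapes=8):
    """Draw a random number of simple shapes (ellipses, boxes, lines)."""
    img = Image.new("RGB", (size, size), random_color())
    draw = ImageDraw.Draw(img)
    for _ in range(random.randint(1, max_shapes)):
        x0, y0 = random.randint(0, size - 2), random.randint(0, size - 2)
        x1, y1 = random.randint(x0 + 1, size), random.randint(y0 + 1, size)
        shape = random.choice(["ellipse", "rectangle", "line"])
        if shape == "ellipse":
            draw.ellipse([x0, y0, x1, y1], fill=random_color())
        elif shape == "rectangle":
            draw.rectangle([x0, y0, x1, y1], fill=random_color())
        else:
            draw.line([x0, y0, x1, y1], fill=random_color(),
                      width=random.randint(1, 4))
    return img


if __name__ == "__main__":
    # Generate a few samples that could feed a GAN pretraining data loader.
    for i in range(4):
        synthesize_primitive_image().save(f"primitive_{i}.png")
```

Such a generator can be wrapped in a standard dataset class and used in place of a real-image dataset during GAN pretraining, before fine-tuning on the low-shot target data.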
