
Workshop: Synthetic Data for Empowering ML Research

Mind Your Step: Continuous Conditional GANs with Generator Regularization

Yunkai Zhang · Yufeng Zheng · Amber Ma · Siyuan Teng · Zeyu Zheng


Conditional Generative Adversarial Networks (cGANs) are known to be difficult to train, especially when the conditions are continuous and high-dimensional. To partially alleviate this difficulty, we propose a simple regularization term on the GAN generator loss in the form of a Lipschitz penalty. The intuition behind this penalty is that, when the generator is fed neighboring conditions in the continuous condition space, the regularization term leverages this neighborhood information and pushes the generator to produce similar conditional distributions for neighboring conditions. We analyze the effect of the proposed regularization term and demonstrate its robust performance on a range of synthetic tasks as well as real-world conditional time-series generation tasks.
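The abstract does not give the exact form of the penalty, but the idea of penalizing the generator's rate of change across neighboring conditions can be sketched as follows. This is a minimal, hypothetical NumPy illustration: `toy_generator` stands in for a neural-network generator, and the specific penalty form (a squared hinge on the output-change rate relative to a target Lipschitz constant `K`) is an assumption for illustration, not the paper's formula.

```python
import numpy as np

def toy_generator(z, c):
    # Stand-in for a neural-network generator G(z, c): maps noise z
    # and a continuous condition c to a sample. Purely illustrative.
    return np.tanh(z + 2.0 * c)

def generator_lipschitz_penalty(gen, z, c, eps=1e-2, K=1.0):
    """One plausible form of a generator Lipschitz penalty (an assumption,
    not the paper's exact formula): perturb each condition c by a small
    random delta and penalize output-change rates exceeding K."""
    delta = eps * np.random.randn(*c.shape)
    diff = gen(z, c + delta) - gen(z, c)
    # Rate of change of the generator output per unit change in condition.
    rate = np.linalg.norm(diff, axis=-1) / (np.linalg.norm(delta, axis=-1) + 1e-12)
    # Squared hinge: only rates above the target constant K are penalized.
    return np.mean(np.maximum(rate - K, 0.0) ** 2)

np.random.seed(0)
z = np.random.randn(64, 4)   # batch of noise vectors
c = np.random.randn(64, 4)   # batch of continuous conditions
penalty = generator_lipschitz_penalty(toy_generator, z, c)
```

In training, a term like this would be added (with a weight hyperparameter) to the generator's adversarial loss, so that neighboring conditions are pushed toward similar conditional output distributions.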
