Poster

UniGAN: Reducing Mode Collapse in GANs using a Uniform Generator

Ziqi Pan · Li Niu · Liqing Zhang

Hall J #222

Keywords: [ GANs ] [ Mode Collapse ]

Wed 30 Nov 2 p.m. PST — 4 p.m. PST
 
Spotlight presentation: Lightning Talks 1B-4
Tue 6 Dec 10:30 a.m. PST — 10:45 a.m. PST

Abstract: Despite the significant progress made in training Generative Adversarial Networks (GANs), the mode collapse problem, which refers to a lack of diversity in generated samples, remains a major challenge. In this paper, we propose a new type of generative diversity named uniform diversity, which relates to a newly proposed type of mode collapse named $u$-mode collapse, in which the generated samples are distributed nonuniformly over the data manifold. From a geometric perspective, we show that uniform diversity is closely related to the generator uniformity property, and that the maximum uniform diversity is achieved if the generator is uniform. To learn a uniform generator, we propose UniGAN, a generative framework with a Normalizing Flow based generator and a simple yet sample-efficient generator uniformity regularization, which can be easily adapted to any other generative framework. We also propose a new diversity metric named udiv to estimate the uniform diversity of a set of generated samples in practice. Experimental results verify the effectiveness of UniGAN in learning a uniform generator and improving uniform diversity.
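The abstract does not spell out the paper's actual regularizer, but the geometric idea can be sketched: with a uniform prior on the latent space, the pushforward density of an invertible generator $g$ is constant over its image exactly when $|\det J_g(z)|$ is constant, so one illustrative penalty is the batch variance of the log-determinant of the generator's Jacobian. The toy flow, the penalty name, and all function names below are hypothetical stand-ins, not UniGAN's actual formulation:

```python
import numpy as np

def flow_forward(z, a=0.5):
    """Toy invertible elementwise flow g(z) = z + a*tanh(z), a hypothetical
    stand-in for a Normalizing Flow generator (monotone for a > -1)."""
    return z + a * np.tanh(z)

def log_abs_det_jacobian(z, a=0.5):
    """log |det J_g(z)| for the elementwise flow above: the Jacobian is
    diagonal with entries 1 + a*(1 - tanh(z)^2), so the log-determinant
    is a sum of logs over the last axis."""
    return np.sum(np.log(1.0 + a * (1.0 - np.tanh(z) ** 2)), axis=-1)

def uniformity_penalty(z_batch, a=0.5):
    """Illustrative uniformity regularizer (assumption, not the paper's):
    with a uniform prior, the generated density is uniform iff |det J_g|
    is constant, so penalize the batch variance of log |det J_g(z)|."""
    log_det = log_abs_det_jacobian(z_batch, a)
    return np.var(log_det)

rng = np.random.default_rng(0)
z = rng.uniform(-1.0, 1.0, size=(1024, 2))

# A nonlinear flow (a = 0.5) warps latent volume unevenly, giving a positive
# penalty; the identity case (a = 0) has a constant Jacobian and zero penalty.
print(uniformity_penalty(z, a=0.5), uniformity_penalty(z, a=0.0))
```

Such a penalty is sample efficient in the sense that it only needs the per-sample log-determinants already computed by a Normalizing Flow, though the paper's exact regularization may differ.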
