Poster
Projected GANs Converge Faster
Axel Sauer · Kashyap Chitta · Jens Müller · Andreas Geiger

Tue Dec 07 08:30 AM -- 10:00 AM (PST)

Generative Adversarial Networks (GANs) produce high-quality images but are challenging to train. They need careful regularization, vast amounts of compute, and expensive hyper-parameter sweeps. We make significant headway on these issues by projecting generated and real samples into a fixed, pretrained feature space. Motivated by the finding that the discriminator cannot fully exploit features from deeper layers of the pretrained model, we propose a more effective strategy that mixes features across channels and resolutions. Our Projected GAN improves image quality, sample efficiency, and convergence speed. It is further compatible with resolutions of up to one Megapixel and advances the state-of-the-art Fréchet Inception Distance (FID) on twenty-two benchmark datasets. Importantly, Projected GANs match the previously lowest FIDs up to 40 times faster, cutting the wall-clock time from 5 days to less than 3 hours given the same computational resources.
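The core idea above — training the discriminator on samples projected into a fixed, frozen feature space rather than on raw pixels — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the paper uses a pretrained network (with cross-channel and cross-resolution feature mixing) as the projector, whereas here a frozen random matrix `F` stands in for it, and the discriminator is a single linear layer. All names (`F`, `project`, `hinge_d_loss`, `w`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the fixed, pretrained feature extractor. In the paper this
# is a pretrained network whose weights never receive gradients; here a
# frozen random projection plays that role for illustration only.
F = rng.standard_normal((64, 16))  # maps 64-dim "samples" to 16-dim features

def project(x):
    """Project real or generated samples into the fixed feature space."""
    return x @ F  # F is frozen: only the discriminator (and generator) train

def hinge_d_loss(real_logits, fake_logits):
    """Hinge discriminator loss, computed on projected features."""
    return (np.mean(np.maximum(0.0, 1.0 - real_logits))
            + np.mean(np.maximum(0.0, 1.0 + fake_logits)))

# Toy discriminator: a linear layer acting on projected features.
w = rng.standard_normal(16)

real = rng.standard_normal((8, 64))  # stand-in for real samples
fake = rng.standard_normal((8, 64))  # stand-in for generated samples

loss = hinge_d_loss(project(real) @ w, project(fake) @ w)
```

Because the projector is frozen, the discriminator only sees (and back-propagates through) the projected features — the mechanism the abstract credits for the improved sample efficiency and convergence speed.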

Author Information

Axel Sauer (University of Tübingen)
Kashyap Chitta (Max Planck Institute for Intelligent Systems)
Jens Müller (Heidelberg University)
Andreas Geiger (MPI Tübingen)
