Private GANs, Revisited
Alex Bie · Gautam Kamath · Guojun Zhang

Fri Dec 02 08:46 AM -- 08:48 AM (PST)
Event URL: https://openreview.net/forum?id=9RH0x167xSI

We show that with improved training, the standard approach for differentially private GANs -- updating the discriminator with noisy gradients -- achieves or competes with state-of-the-art results for private image synthesis. Existing instantiations of this approach neglect to consider how adding noise only to discriminator updates disrupts the careful balance between generator and discriminator necessary for successful GAN training. We show that a simple fix restores this balance: taking more discriminator steps between generator steps. Finally, in pursuit of the same balance, we experiment with further modifications to discriminator training and see additional improvements in generation quality. For MNIST at ε = 10, our private GANs improve the record FID from 48.4 to 13.0, as well as downstream classifier accuracy from 83.2% to 95.0%.
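The training scheme the abstract describes can be sketched on a toy problem: the discriminator is updated with DP-SGD (per-example gradients clipped, then Gaussian noise added), the generator is updated without noise, and several discriminator steps are taken per generator step. This is a minimal illustrative sketch, not the paper's implementation; the 1-D "GAN" (generator g(z) = θ + z, logistic discriminator), the hyperparameters, and the helper `dp_mean_grad` are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def dp_mean_grad(per_example_grads, clip_norm, noise_mult, rng):
    """DP-SGD aggregation (hypothetical helper): clip each per-example
    gradient to `clip_norm`, average, add Gaussian noise scaled to the
    clipping bound divided by the batch size."""
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads / np.maximum(1.0, norms / clip_norm)
    mean = clipped.mean(axis=0)
    std = noise_mult * clip_norm / per_example_grads.shape[0]
    return mean + rng.normal(0.0, std, size=mean.shape)

# Toy 1-D GAN: generator g(z) = theta + z, discriminator d(x) = sigmoid(w*x + b).
theta, w, b = 0.0, 0.1, 0.0
lr_d, lr_g = 0.5, 0.5
batch, n_d = 64, 5          # n_d: discriminator steps per generator step

for step in range(200):
    for _ in range(n_d):    # extra discriminator steps restore the balance
        real = rng.normal(1.0, 1.0, batch)          # "real" data ~ N(1, 1)
        fake = theta + rng.normal(0.0, 0.1, batch)  # generator samples
        x = np.concatenate([real, fake])
        y = np.concatenate([np.ones(batch), np.zeros(batch)])
        err = sigmoid(w * x + b) - y                # BCE residual per example
        grads = np.stack([err * x, err], axis=1)    # per-example (dw, db)
        gw, gb = dp_mean_grad(grads, clip_norm=1.0, noise_mult=1.0, rng=rng)
        w -= lr_d * gw
        b -= lr_d * gb
    # Generator step is non-private: only discriminator gradients are noised.
    fake = theta + rng.normal(0.0, 0.1, batch)
    d_fake = sigmoid(w * fake + b)
    theta -= lr_g * np.mean(-(1.0 - d_fake) * w)    # grad of -log d(fake)
```

By post-processing, noising only the discriminator's gradients suffices for the whole procedure to be differentially private, since the generator never touches real data directly; the inner loop is the "more discriminator steps" fix.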

Author Information

Alex Bie (University of Waterloo)
Gautam Kamath (University of Waterloo)
Guojun Zhang (University of Waterloo)

I am a third-year Ph.D. student in the David R. Cheriton School of Computer Science at the University of Waterloo and am also a student affiliate of the Vector Institute. My supervisors are Pascal Poupart and Yaoliang Yu. I am working on optimization problems in machine learning.
