Poster
Optimistic Dual Extrapolation for Coherent Non-monotone Variational Inequalities
Chaobing Song · Zhengyuan Zhou · Yichao Zhou · Yong Jiang · Yi Ma
The optimization problems associated with training generative adversarial networks can largely be reduced to certain non-monotone variational inequality problems (VIPs), whereas existing convergence results are mostly based on monotone or strongly monotone assumptions. In this paper, we propose optimistic dual extrapolation (OptDE), a method that performs only one gradient evaluation per iteration. We show that OptDE provably converges to a strong solution under different coherent non-monotone assumptions. In particular, when a weak solution exists, the convergence rate of our method is $O(1/\epsilon^{2})$, which matches the best existing result for methods with two gradient evaluations. Further, when a $\sigma$-weak solution exists, the convergence guarantee improves to the linear rate $O(\log\frac{1}{\epsilon})$. Along the way, as a byproduct of our inquiries into non-monotone variational inequalities, we provide a near-optimal $O\big(\frac{1}{\epsilon}\log\frac{1}{\epsilon}\big)$ convergence guarantee in terms of the restricted strong merit function for monotone variational inequalities. We also show how our results naturally generalize to the stochastic setting, and obtain corresponding new convergence results. Taken together, our results contribute to the broad landscape of variational inequalities, both non-monotone and monotone alike, by providing a novel and more practical algorithm with state-of-the-art convergence guarantees.
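To make the "one gradient evaluation per iteration" idea concrete, here is a minimal sketch of a single-call optimistic update on a toy bilinear saddle point, whose VIP operator is $F(x, y) = (y, -x)$. This is a generic optimistic-gradient illustration, not the exact OptDE update from the paper; the operator, step size, and iteration count are all assumptions chosen for the example.

```python
# Toy VIP: the bilinear game min_x max_y x*y, with monotone operator
# F(x, y) = (y, -x) and unique strong solution (0, 0).

def F(z):
    x, y = z
    return (y, -x)

def optimistic_step(z, F_prev, eta=0.1):
    """One iteration using a single NEW operator evaluation F(z);
    the previous evaluation F_prev is reused for the extrapolation:
        z_new = z - eta * (2*F(z) - F_prev)
    """
    Fz = F(z)
    z_new = tuple(zi - eta * (2.0 * fi - fpi)
                  for zi, fi, fpi in zip(z, Fz, F_prev))
    return z_new, Fz  # return Fz so the next step can reuse it

z = (1.0, 1.0)
F_prev = F(z)
for _ in range(2000):
    z, F_prev = optimistic_step(z, F_prev)

print(z)  # iterates spiral in toward the strong solution (0, 0)
```

Note that plain gradient descent-ascent diverges on this bilinear game; the extrapolation with the stored past evaluation is what restores convergence while still costing only one operator call per iteration.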
Author Information
Chaobing Song (University of Wisconsin-Madison)
Zhengyuan Zhou (Stanford University)
Yichao Zhou (UC Berkeley)
Yong Jiang (Tsinghua)
Yi Ma (UC Berkeley)
More from the Same Authors
- 2020 Poster: Variance Reduction via Accelerated Dual Averaging for Finite-Sum Optimization »
  Chaobing Song · Yong Jiang · Yi Ma
- 2020 Poster: Stochastic Deep Gaussian Processes over Graphs »
  Naiqi Li · Wenjie Li · Jifeng Sun · Yinghua Gao · Yong Jiang · Shu-Tao Xia
- 2020 Poster: Robust Recovery via Implicit Bias of Discrepant Learning Rates for Double Over-parameterization »
  Chong You · Zhihui Zhu · Qing Qu · Yi Ma
- 2020 Spotlight: Robust Recovery via Implicit Bias of Discrepant Learning Rates for Double Over-parameterization »
  Chong You · Zhihui Zhu · Qing Qu · Yi Ma
- 2020 Poster: Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction »
  Yaodong Yu · Kwan Ho Ryan Chan · Chong You · Chaobing Song · Yi Ma
- 2019 Poster: Learning in Generalized Linear Contextual Bandits with Stochastic Delays »
  Zhengyuan Zhou · Renyuan Xu · Jose Blanchet
- 2019 Spotlight: Learning in Generalized Linear Contextual Bandits with Stochastic Delays »
  Zhengyuan Zhou · Renyuan Xu · Jose Blanchet
- 2019 Poster: NeurVPS: Neural Vanishing Point Scanning via Conic Convolution »
  Yichao Zhou · Haozhi Qi · Jingwei Huang · Yi Ma
- 2019 Poster: Online EXP3 Learning in Adversarial Bandits with Delayed Feedback »
  Ilai Bistritz · Zhengyuan Zhou · Xi Chen · Nicholas Bambos · Jose Blanchet
- 2018 Poster: Learning in Games with Lossy Feedback »
  Zhengyuan Zhou · Panayotis Mertikopoulos · Susan Athey · Nicholas Bambos · Peter W Glynn · Yinyu Ye
- 2017 Poster: Countering Feedback Delays in Multi-Agent Learning »
  Zhengyuan Zhou · Panayotis Mertikopoulos · Nicholas Bambos · Peter W Glynn · Claire Tomlin
- 2017 Poster: Stochastic Mirror Descent in Variationally Coherent Optimization Problems »
  Zhengyuan Zhou · Panayotis Mertikopoulos · Nicholas Bambos · Stephen Boyd · Peter W Glynn