Poster
Stochastic Halpern Iteration with Variance Reduction for Stochastic Monotone Inclusions
Xufeng Cai · Chaobing Song · Cristóbal Guzmán · Jelena Diakonikolas
We study stochastic monotone inclusion problems, which widely appear in machine learning applications, including robust regression and adversarial learning. We propose novel variants of stochastic Halpern iteration with recursive variance reduction. In the cocoercive (and, more generally, Lipschitz-monotone) setting, our algorithm attains a point with operator norm at most $\epsilon$ using $\mathcal{O}(\frac{1}{\epsilon^3})$ stochastic operator evaluations, which significantly improves over the state-of-the-art $\mathcal{O}(\frac{1}{\epsilon^4})$ stochastic operator evaluations required by existing monotone inclusion solvers on the same problem classes. We further show how to couple one of the proposed variants of stochastic Halpern iteration with a scheduled restart scheme to solve stochastic monotone inclusion problems with $\mathcal{O}(\frac{\log(1/\epsilon)}{\epsilon^2})$ stochastic operator evaluations under additional sharpness or strong monotonicity assumptions. Finally, we argue via reductions between different problem classes that our stochastic oracle complexity bounds are tight up to logarithmic factors in terms of their $\epsilon$-dependence.
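To fix intuition, here is a minimal NumPy sketch of (deterministic) Halpern iteration for a cocoercive operator, with the classical anchoring weights $\lambda_k = 1/(k+2)$. The quadratic operator, step size, and iteration count are illustrative assumptions; the paper's stochastic oracle and recursive variance-reduced estimator are not reproduced here.

```python
import numpy as np

# Illustrative cocoercive operator: F(x) = A x with A symmetric PSD,
# so F is (1/L)-cocoercive for L = ||A||_2 (an assumption for this sketch).
rng = np.random.default_rng(0)
n = 10
B = rng.standard_normal((n, n))
A = B @ B.T / n  # symmetric positive semidefinite

def F(x):
    return A @ x

# Halpern iteration: x_{k+1} = lam_k * x0 + (1 - lam_k) * (x_k - eta * F(x_k)),
# anchoring every iterate back toward the starting point x0.
x0 = rng.standard_normal(n)
x = x0.copy()
eta = 0.5 / np.linalg.norm(A, 2)  # conservative step size for illustration
for k in range(2000):
    lam = 1.0 / (k + 2)
    x = lam * x0 + (1 - lam) * (x - eta * F(x))

print(np.linalg.norm(F(x)))  # the operator norm shrinks toward 0
```

The anchoring term is what yields guarantees on the operator norm $\|F(x_k)\|$ itself (rather than on a gap function); the paper's contribution is replacing the exact evaluation `F(x)` above with a recursively variance-reduced stochastic estimate while preserving this behavior.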
Author Information
Xufeng Cai (University of Wisconsin-Madison)
Chaobing Song (University of Wisconsin-Madison)
Cristóbal Guzmán (PUC-Chile)
Jelena Diakonikolas (University of Wisconsin-Madison)
More from the Same Authors
- 2022 : Contributed Talks 3 »
  Cristóbal Guzmán · Fangshuo Liao · Vishwak Srinivasan · Zhiyuan Li
- 2022 Workshop: OPT 2022: Optimization for Machine Learning »
  Courtney Paquette · Sebastian Stich · Quanquan Gu · Cristóbal Guzmán · John Duchi
- 2022 Workshop: Order up! The Benefits of Higher-Order Optimization in Machine Learning »
  Albert Berahas · Jelena Diakonikolas · Jarad Forristal · Brandon Reese · Martin Takac · Yan Xu
- 2022 Poster: A Fast Scale-Invariant Algorithm for Non-negative Least Squares with Non-negative Data »
  Jelena Diakonikolas · Chenghui Li · Swati Padmanabhan · Chaobing Song
- 2022 Poster: Differentially Private Generalized Linear Models Revisited »
  Raman Arora · Raef Bassily · Cristóbal Guzmán · Michael Menart · Enayat Ullah
- 2022 Poster: Coordinate Linear Variance Reduction for Generalized Linear Programming »
  Chaobing Song · Cheuk Yin Lin · Stephen Wright · Jelena Diakonikolas
- 2022 Poster: Between Stochastic and Adversarial Online Convex Optimization: Improved Regret Bounds via Smoothness »
  Sarah Sachs · Hedi Hadiji · Tim van Erven · Cristóbal Guzmán
- 2021 : Q&A with Cristóbal Guzmán »
  Cristóbal Guzmán
- 2021 : Non-Euclidean Differentially Private Stochastic Convex Optimization, Cristóbal Guzmán »
  Cristóbal Guzmán
- 2021 Poster: Best-case lower bounds in online learning »
  Cristóbal Guzmán · Nishant Mehta · Ali Mortazavi
- 2021 Poster: Differentially Private Stochastic Optimization: New Results in Convex and Non-Convex Settings »
  Raef Bassily · Cristóbal Guzmán · Michael Menart
- 2020 Poster: Variance Reduction via Accelerated Dual Averaging for Finite-Sum Optimization »
  Chaobing Song · Yong Jiang · Yi Ma
- 2020 Poster: Optimistic Dual Extrapolation for Coherent Non-monotone Variational Inequalities »
  Chaobing Song · Zhengyuan Zhou · Yichao Zhou · Yong Jiang · Yi Ma
- 2020 Poster: Stability of Stochastic Gradient Descent on Nonsmooth Convex Losses »
  Raef Bassily · Vitaly Feldman · Cristóbal Guzmán · Kunal Talwar
- 2020 Spotlight: Stability of Stochastic Gradient Descent on Nonsmooth Convex Losses »
  Raef Bassily · Vitaly Feldman · Cristóbal Guzmán · Kunal Talwar
- 2020 Poster: Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction »
  Yaodong Yu · Kwan Ho Ryan Chan · Chong You · Chaobing Song · Yi Ma