Oral
Oracle Complexity in Nonsmooth Nonconvex Optimization
Guy Kornowski · Ohad Shamir
It is well-known that given a smooth, bounded-from-below, and possibly nonconvex function, standard gradient-based methods can find $\epsilon$-stationary points (with gradient norm less than $\epsilon$) in $\mathcal{O}(1/\epsilon^2)$ iterations. However, many important nonconvex optimization problems, such as those associated with training modern neural networks, are inherently not smooth, making these results inapplicable. In this paper, we study nonsmooth nonconvex optimization from an oracle complexity viewpoint, where the algorithm is assumed to be given access only to local information about the function at various points. We provide two main results (under mild assumptions): First, we consider the problem of getting \emph{near} $\epsilon$-stationary points. This is perhaps the most natural relaxation of \emph{finding} $\epsilon$-stationary points, which is impossible in the nonsmooth nonconvex case. We prove that this relaxed goal cannot be achieved efficiently, for any distance and $\epsilon$ smaller than some constants. Our second result deals with the possibility of tackling nonsmooth nonconvex optimization by reduction to smooth optimization: Namely, applying smooth optimization methods on a smooth approximation of the objective function. For this approach, we prove an inherent trade-off between oracle complexity and smoothness: On the one hand, smoothing a nonsmooth nonconvex function can be done very efficiently (e.g., by randomized smoothing), but with dimension-dependent factors in the smoothness parameter, which can strongly affect iteration complexity when plugging into standard smooth optimization methods. On the other hand, these dimension factors can be eliminated with suitable smoothing methods, but only by making the oracle complexity of the smoothing process exponentially large.
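The randomized smoothing mentioned in the abstract replaces a nonsmooth function $f$ with the surrogate $f_\delta(x) = \mathbb{E}_{u \sim \mathcal{N}(0, I)}[f(x + \delta u)]$, whose gradient can be estimated with function values alone. The sketch below (not from the paper; `randomized_smoothing_grad` and all parameter choices are illustrative assumptions) demonstrates the standard zeroth-order Gaussian-smoothing estimator on a simple nonsmooth test function:

```python
import numpy as np

def randomized_smoothing_grad(f, x, delta=0.1, num_samples=2000, rng=None):
    """Estimate grad f_delta(x), where f_delta(x) = E_u[f(x + delta*u)],
    u ~ N(0, I), using the identity
        grad f_delta(x) = E_u[(f(x + delta*u) - f(x)) * u] / delta.
    Only zeroth-order (function-value) oracle access to f is needed.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    d = x.shape[0]
    fx = f(x)
    g = np.zeros(d)
    for _ in range(num_samples):
        u = rng.standard_normal(d)
        g += (f(x + delta * u) - fx) * u
    return g / (delta * num_samples)

# Illustrative nonsmooth function: f(x) = |x_0| (nondifferentiable at 0).
f = lambda x: abs(x[0])

# Away from the kink, the smoothed gradient is close to the true gradient 1.
g = randomized_smoothing_grad(f, np.array([1.0]))
```

Note that the smoothness parameter of $f_\delta$ scales with $\sqrt{d}/\delta$ for Lipschitz $f$, which is exactly the dimension-dependent factor the paper's trade-off result refers to.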
Author Information
Guy Kornowski (Weizmann Institute of Science)
Ohad Shamir (Weizmann Institute of Science)
Related Events (a corresponding poster, oral, or spotlight)

2021 Poster: Oracle Complexity in Nonsmooth Nonconvex Optimization »
Thu Dec 9th 08:30 – 10:00 AM, Room None
More from the Same Authors

2021 Spotlight: Random Shuffling Beats SGD Only After Many Epochs on Ill-Conditioned Problems »
Itay Safran · Ohad Shamir 
2021 Poster: Learning a Single Neuron with Bias Using Gradient Descent »
Gal Vardi · Gilad Yehudai · Ohad Shamir 
2021 Poster: A Stochastic Newton Algorithm for Distributed Convex Optimization »
Brian Bullins · Kshitij Patel · Ohad Shamir · Nathan Srebro · Blake Woodworth 
2021 Poster: Random Shuffling Beats SGD Only After Many Epochs on Ill-Conditioned Problems »
Itay Safran · Ohad Shamir 
2020 : Poster Session 1 (gather.town) »
Laurent Condat · Tiffany Vlaar · Ohad Shamir · Mohammadi Zaki · Zhize Li · Guan-Horng Liu · Samuel Horváth · Mher Safaryan · Yoni Choukroun · Kumar Shridhar · Nabil Kahale · Jikai Jin · Pratik Kumar Jawanpuria · Gaurav Kumar Yadav · Kazuki Koyama · Junyoung Kim · Xiao Li · Saugata Purkayastha · Adil Salim · Dighanchal Banerjee · Peter Richtarik · Lakshman Mahto · Tian Ye · Bamdev Mishra · Huikang Liu · Jiajie Zhu 
2020 : Contributed talks in Session 1 (Zoom) »
Sebastian Stich · Laurent Condat · Zhize Li · Ohad Shamir · Tiffany Vlaar · Mohammadi Zaki 
2020 : Contributed Video: Can We Find Near-Approximately-Stationary Points of Nonsmooth Nonconvex Functions?, Ohad Shamir »
Ohad Shamir 
2020 Poster: Neural Networks with Small Weights and Depth-Separation Barriers »
Gal Vardi · Ohad Shamir 
2019 Poster: On the Power and Limitations of Random Features for Understanding Neural Networks »
Gilad Yehudai · Ohad Shamir 
2018 Poster: Are ResNets Provably Better than Linear Predictors? »
Ohad Shamir 
2018 Poster: Global Nonconvex Optimization with Discretized Diffusions »
Murat Erdogdu · Lester Mackey · Ohad Shamir 
2016 Poster: Dimension-Free Iteration Complexity of Finite Sum Optimization Problems »
Yossi Arjevani · Ohad Shamir 
2016 Poster: Without-Replacement Sampling for Stochastic Gradient Methods »
Ohad Shamir 
2016 Oral: Without-Replacement Sampling for Stochastic Gradient Methods »
Ohad Shamir 
2015 Poster: Communication Complexity of Distributed Convex Learning and Optimization »
Yossi Arjevani · Ohad Shamir 
2014 Poster: Fundamental Limits of Online and Distributed Algorithms for Statistical Learning and Estimation »
Ohad Shamir 
2014 Poster: On the Computational Efficiency of Training Neural Networks »
Roi Livni · Shai Shalev-Shwartz · Ohad Shamir 
2013 Poster: Online Learning with Switching Costs and Other Adaptive Adversaries »
Nicolò Cesa-Bianchi · Ofer Dekel · Ohad Shamir