Poster
SPIDER: Near-Optimal Non-Convex Optimization via Stochastic Path-Integrated Differential Estimator
Cong Fang · Chris Junchi Li · Zhouchen Lin · Tong Zhang
Room 210 #49
Keywords: [ Stochastic Methods ] [ Non-Convex Optimization ]
Abstract:
In this paper, we propose a new technique named the Stochastic Path-Integrated Differential EstimatoR (SPIDER), which can be used to track many deterministic quantities of interest with significantly reduced computational cost.
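The abstract does not spell out the estimator itself, but the path-integrated recursion it names can be sketched as follows (the notation $v_k$, the mini-batch $S_k$, and the refresh period $q$ are our illustrative choices, not fixed by the abstract):

$$v_k \;=\; \nabla f_{S_k}(x_k) \;-\; \nabla f_{S_k}(x_{k-1}) \;+\; v_{k-1},$$

with $v_0 = \nabla f(x_0)$ recomputed from a full (or large) batch every $q$ iterations. The differences telescope, so $v_k$ tracks the full gradient $\nabla f(x_k)$ along the path $x_0, \ldots, x_k$ at mini-batch cost.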
Combining SPIDER with the method of normalized gradient descent, we propose SPIDER-SFO, which solves non-convex stochastic optimization problems using stochastic gradients only.
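As a rough illustration of that combination, here is a minimal Python sketch under stated assumptions: the helper grad_batch(x, idx) (returning the average gradient over the component functions indexed by idx), the hyperparameter defaults eta, q, and batch, and the constant normalized step are all ours, not the paper's tuned schedule.

import numpy as np

def spider_sfo(grad_batch, x0, n, eta=0.01, q=32, batch=32, iters=1000, rng=None):
    # Minimal sketch of SPIDER-SFO: normalized gradient descent driven by
    # the SPIDER gradient estimator (illustrative hyperparameters).
    rng = rng or np.random.default_rng(0)
    x = x0.copy()
    for k in range(iters):
        if k % q == 0:
            v = grad_batch(x, np.arange(n))       # full-batch refresh of the estimator
        else:
            idx = rng.integers(0, n, size=batch)  # sample mini-batch S_k
            # path-integrated update: v_k = grad_S(x_k) - grad_S(x_{k-1}) + v_{k-1}
            v = grad_batch(x, idx) - grad_batch(x_prev, idx) + v
        x_prev = x
        x = x - eta * v / (np.linalg.norm(v) + 1e-12)  # normalized gradient step
    return x

For a least-squares finite sum, for instance, one could pass grad_batch = lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx) for given data A, b.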
We provide a few error-bound results on its convergence rates.
Specifically, we prove that the SPIDER-SFO algorithm achieves a gradient computation cost of $O(\min(n^{1/2}\epsilon^{-2}, \epsilon^{-3}))$ to find an $\epsilon$-approximate first-order stationary point.
In addition, we prove that SPIDER-SFO nearly matches the algorithmic lower bound for finding a stationary point under the gradient Lipschitz assumption in the finite-sum setting.
Our SPIDER technique can be further applied to find an $(\epsilon, O(\epsilon^{0.5}))$-approximate second-order stationary point at a gradient computation cost of $\tilde{O}(\min(n^{1/2}\epsilon^{-2} + \epsilon^{-2.5}, \epsilon^{-3}))$.