Poster
Adaptive SVRG Methods under Error Bound Conditions with Unknown Growth Parameter
Yi Xu · Qihang Lin · Tianbao Yang

Tue Dec 05 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #169

The error bound, an inherent property of an optimization problem, has recently been revived in the development of algorithms with improved global convergence without strong convexity. The most studied error bound is the quadratic error bound, which generalizes strong convexity and is satisfied by a large family of machine learning problems. The quadratic error bound has been leveraged to achieve linear convergence in many first-order methods, including the stochastic variance reduced gradient (SVRG) method, one of the most important stochastic optimization methods in machine learning. However, studies along this direction face a critical issue: the algorithms must depend on an unknown growth parameter (a generalization of the strong convexity modulus) in the error bound. This parameter is difficult to estimate exactly, and algorithms that choose it heuristically have no theoretical convergence guarantee. To address this issue, we propose novel SVRG methods that automatically search for this unknown parameter on the fly during optimization, while still obtaining almost the same convergence rate as when the parameter is known. We also analyze the convergence of SVRG methods under the Hölderian error bound, which generalizes the quadratic error bound.
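The paper's adaptive parameter-search scheme is not spelled out in this abstract. As background for the SVRG method it builds on, here is a minimal sketch of plain SVRG on a synthetic least-squares problem; all problem sizes, step sizes, and epoch counts are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: f(w) = (1/2n) * ||A w - b||^2.
n, d = 200, 10
A = rng.standard_normal((n, d))
w_star = rng.standard_normal(d)
b = A @ w_star  # noiseless targets, so w_star is a minimizer

def full_grad(w):
    return A.T @ (A @ w - b) / n

def svrg(w0, eta=0.02, epochs=30, inner=2 * n):
    """Plain SVRG: each epoch takes a snapshot point, computes the full
    gradient there, and uses it to variance-reduce the stochastic steps."""
    w = w0.copy()
    for _ in range(epochs):
        w_snap = w.copy()
        mu = full_grad(w_snap)  # full gradient at the snapshot
        for _ in range(inner):
            i = rng.integers(n)
            a_i = A[i]
            # variance-reduced stochastic gradient:
            # grad f_i(w) - grad f_i(w_snap) + mu
            g = a_i * (a_i @ w - b[i]) - a_i * (a_i @ w_snap - b[i]) + mu
            w -= eta * g
    return w

w = svrg(np.zeros(d))
print(np.linalg.norm(w - w_star))  # distance to the minimizer shrinks toward 0
```

Because the snapshot's full gradient corrects each stochastic step, the gradient estimate's variance vanishes as the iterates approach the solution, which is what enables linear convergence under strong convexity or a quadratic error bound; the adaptive methods in the paper additionally search for the unknown growth parameter rather than assuming it.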

Author Information

Yi Xu (The University of Iowa)
Qihang Lin (University of Iowa)
Tianbao Yang (The University of Iowa)