Analysis of Schedule-Free Nonconvex Optimization
Connor Brown · Ahmed Khaled · Chi Jin
Abstract
The Schedule-Free (SF) method promises optimal performance with hyperparameters that are independent of a known training-time horizon $T$. Nevertheless, nonconvex analyses of SF have either been limited in scope or relied on strong global assumptions. Under minimal smoothness and lower-boundedness assumptions, we introduce a robust Lyapunov framework for analyzing SF in the nonconvex setting, yielding a family of horizon-agnostic $O(1/\log T)$, $O(\log T/T)$, and $O\bigl(T^{-(1-\alpha)}\bigr)$ rates under a variety of conditions. Our analysis, complemented by Performance Estimation Problem (PEP) experiments, extends SF's horizon-free guarantees to smooth nonconvex optimization and charts future directions for optimal nonconvex rates.
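For context, the Schedule-Free iteration (in the style of Defazio et al.) maintains a fast gradient-stepping sequence and a running average, querying gradients at an interpolation between the two; its step size and interpolation weight do not depend on the horizon $T$. The sketch below is an illustrative implementation under that assumed update rule, not the paper's exact algorithm; the test objective $f(w) = w^2 + \tfrac12\sin(3w)$ is a hypothetical smooth nonconvex example chosen for demonstration.

```python
import numpy as np

def schedule_free_sgd(grad, x0, gamma=0.02, beta=0.9, steps=20000):
    """Illustrative Schedule-Free update: no decaying step-size schedule;
    gamma and beta are fixed, independent of the number of steps."""
    z = x0.astype(float).copy()  # "base" sequence taking gradient steps
    x = x0.astype(float).copy()  # running average, returned as the solution
    for t in range(steps):
        y = (1.0 - beta) * z + beta * x   # gradients are queried here
        z = z - gamma * grad(y)           # gradient step on the base sequence
        c = 1.0 / (t + 1)                 # uniform-averaging weight
        x = (1.0 - c) * x + c * z         # online average of the z iterates
    return x

# Hypothetical smooth nonconvex objective: f(w) = w^2 + 0.5*sin(3w)
f_grad = lambda w: 2.0 * w + 1.5 * np.cos(3.0 * w)
w_star = schedule_free_sgd(f_grad, np.array([3.0]))
```

Because the averaging weight $c_{t+1} = 1/(t+1)$ is schedule-free, no tuning against a known horizon is needed; the returned average settles near a stationary point of the objective.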