Convergence of Clipped SGD on Convex $(L_0,L_1)$-Smooth Functions
Ofir Gaash · Kfir Y. Levy · Yair Carmon
Abstract
We study stochastic gradient descent (SGD) with gradient clipping on convex functions under a generalized smoothness assumption called $(L_0,L_1)$-smoothness. Using gradient clipping, we establish a high-probability convergence rate that matches the SGD rate in the $L$-smooth case up to polylogarithmic factors and additive terms. We also propose a variation of adaptive SGD with gradient clipping that achieves the same guarantee. We perform empirical experiments to examine our theory and algorithmic choices.
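Since the abstract refers to SGD with gradient clipping but does not spell out the update rule, here is a minimal, generic sketch of norm-clipped SGD for illustration; it is not the paper's exact algorithm, and the objective, step size `eta`, and clipping threshold `c` are assumptions chosen for the example.

```python
import numpy as np

def clipped_sgd(grad_fn, x0, eta=0.05, c=1.0, n_steps=1000, seed=0):
    """Generic SGD with norm clipping (illustrative sketch):
    x_{t+1} = x_t - eta * clip_c(g_t), where clip_c(g) = g * min(1, c / ||g||)."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        g = grad_fn(x, rng)              # stochastic gradient estimate
        norm = np.linalg.norm(g)
        if norm > c:
            g = g * (c / norm)           # rescale gradient to the clipping radius c
        x = x - eta * g
    return x

# Toy usage: noisy gradients of the quadratic f(x) = 0.5 * ||x||^2.
if __name__ == "__main__":
    noisy_grad = lambda x, rng: x + rng.normal(scale=0.5, size=x.shape)
    x_final = clipped_sgd(noisy_grad, x0=np.ones(10))
    print("final iterate norm:", np.linalg.norm(x_final))
```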