Poster
Tight Dimension Independent Lower Bound on the Expected Convergence Rate for Diminishing Step Sizes in SGD
Phuong Ha Nguyen · Lam Nguyen · Marten van Dijk

Tue Dec 10 05:30 PM -- 07:30 PM (PST) @ East Exhibition Hall B + C #114
We study the convergence of Stochastic Gradient Descent (SGD) for strongly convex objective functions. For every iteration $t$, we prove a lower bound on the expected convergence rate after the $t$-th SGD iteration; the lower bound holds over all possible sequences of diminishing step sizes. It implies that step-size sequences recently proposed at ICML 2018 and ICML 2019 are {\em universally} close to optimal, in that the expected convergence rate after {\em each} iteration is within a factor of $32$ of our lower bound. This factor is independent of the dimension $d$. We offer a framework for comparing against lower bounds in the state-of-the-art literature; when applied to SGD for strongly convex objective functions, our lower bound is larger than existing bounds by a significant factor of $775\cdot d$.
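
Below is a minimal, illustrative sketch of the setting the abstract describes: SGD with a diminishing step-size schedule on a strongly convex objective, where the expected squared error decays roughly like $O(1/t)$. The quadratic objective, the Gaussian noise model, and the schedule $\eta_t = 2/(\mu(t+1))$ are assumptions chosen for illustration and are not taken from the paper's construction.

```python
# Illustrative sketch (not the paper's construction): SGD with an O(1/t)
# diminishing step size on a strongly convex quadratic. The objective,
# noise model, and schedule eta_t = 2 / (mu * (t + 1)) are assumptions.
import numpy as np

rng = np.random.default_rng(0)

d = 10          # dimension
mu = 1.0        # strong-convexity parameter (Hessian is mu * I)
sigma = 0.1     # std dev of additive gradient noise
w_star = np.zeros(d)   # minimizer of f(w) = (mu / 2) * ||w - w_star||^2

w = rng.normal(size=d)  # initial iterate
T = 10_000
errors = np.empty(T)

for t in range(1, T + 1):
    # Unbiased stochastic gradient: true gradient plus zero-mean noise.
    grad = mu * (w - w_star) + sigma * rng.normal(size=d)
    eta_t = 2.0 / (mu * (t + 1))   # diminishing step size, O(1/t)
    w = w - eta_t * grad
    errors[t - 1] = np.sum((w - w_star) ** 2)

# The squared error should decay roughly like O(1/t); inspect a few checkpoints.
for t in (10, 100, 1000, 10_000):
    print(f"t={t:6d}  ||w_t - w*||^2 ≈ {errors[t - 1]:.3e}")
```

The paper's result concerns how fast any such diminishing schedule can make the expected squared error decay: no sequence of step sizes can beat the proved lower bound, and the ICML 2018/2019 schedules stay within a factor of 32 of it at every iteration.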

Author Information

Phuong Ha Nguyen (University of Connecticut (UCONN))
Lam Nguyen (IBM Research, Thomas J. Watson Research Center)
Marten van Dijk (University of Connecticut)