Poster
Continuous-time Models for Stochastic Optimization Algorithms
Antonio Orvieto · Aurelien Lucchi

Thu Dec 12 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #208

We propose new continuous-time formulations for first-order stochastic optimization algorithms such as mini-batch gradient descent and variance-reduced methods. We exploit these continuous-time models, together with a simple Lyapunov analysis and tools from stochastic calculus, to derive convergence bounds for various types of non-convex functions. Guided by this analysis, we show that the same Lyapunov arguments hold in discrete time, leading to matching rates. In addition, we use these models and Itô calculus to derive novel insights into the dynamics of SGD, proving that a decreasing learning rate acts as time warping or, equivalently, as landscape stretching.
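To make the continuous-time viewpoint concrete, here is a minimal sketch (not the paper's exact model) using the standard SDE approximation of SGD with learning rate eta, dX_t = -grad f(X_t) dt + sqrt(eta) * sigma * dW_t, integrated by Euler-Maruyama alongside the discrete SGD iteration. The quadratic objective, noise scale sigma, step count, and seed are illustrative assumptions, not values from the paper.

import numpy as np

# Illustrative comparison: discrete SGD on a toy quadratic
# f(x) = 0.5 * x^2 versus the standard SDE model
#   dX_t = -f'(X_t) dt + sqrt(eta) * sigma * dW_t,
# integrated by Euler-Maruyama at step size dt = eta.
# All constants below are assumed for the sketch.

rng = np.random.default_rng(0)
eta = 0.01      # learning rate (and SDE integration step, dt = eta)
sigma = 0.5     # gradient-noise scale (assumed)
steps = 1_000

def grad(x):
    return x    # gradient of f(x) = 0.5 * x^2

x_sgd = x_sde = 2.0
for _ in range(steps):
    # Discrete SGD: stochastic gradient = true gradient + Gaussian noise.
    x_sgd -= eta * (grad(x_sgd) + sigma * rng.standard_normal())
    # Euler-Maruyama step of the SDE model with dt = eta.
    dW = np.sqrt(eta) * rng.standard_normal()   # Brownian increment
    x_sde += -grad(x_sde) * eta + np.sqrt(eta) * sigma * dW

print(f"SGD iterate after {steps} steps:    {x_sgd:+.4f}")
print(f"SDE (Euler-Maruyama) iterate:       {x_sde:+.4f}")

With dt = eta, a single Euler-Maruyama step has the same mean and variance as one SGD update, which is why such SDEs serve as faithful models of SGD for small learning rates; the paper's analysis builds convergence bounds for models of this kind via Lyapunov functions.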

Author Information

Antonio Orvieto (ETH Zurich)

PhD student at ETH Zurich, interested in the design and analysis of optimization algorithms for deep learning. Interned at DeepMind, MILA, and Meta. All publications at http://orvi.altervista.org/. Looking for postdoc positions! :) Contact: antonio.orvieto@inf.ethz.ch

Aurelien Lucchi (ETH Zurich)
