Acceleration and Averaging in Stochastic Descent Dynamics
Walid Krichene · Peter Bartlett

Tue Dec 05 11:50 AM -- 11:55 AM (PST) @ Hall C

We formulate and study a general family of (continuous-time) stochastic dynamics for accelerated first-order minimization of smooth convex functions. Building on an averaging formulation of accelerated mirror descent, we propose a stochastic variant in which the gradient is contaminated by noise, and study the resulting stochastic differential equation. We prove a bound on the rate of change of an energy function associated with the problem, then use it to derive convergence-rate estimates for the function values (a.s. and in expectation), both for persistent and for asymptotically vanishing noise. We discuss the interaction between the parameters of the dynamics (learning rate and averaging weights) and the covariation of the noise process, and show, in particular, how the asymptotic rate of covariation affects the choice of parameters and, ultimately, the convergence rate.
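To make the averaging formulation concrete, the sketch below simulates one simple instance of such stochastic accelerated dynamics via an Euler–Maruyama discretization. It assumes a Euclidean mirror map and the coupled ODEs dx/dt = (r/t)(z − x) (averaging) and dz = −(t/r)∇f(x) dt + σ dB_t (noisy descent); the parameter values, the quadratic objective, and the function names are illustrative choices, not the paper's exact dynamics or notation.

```python
import numpy as np

def simulate_stochastic_amd(grad_f, x0, T=20.0, dt=1e-3, r=3.0, sigma=0.1, seed=0):
    """Euler-Maruyama sketch of stochastic accelerated mirror descent
    (Euclidean mirror map). Hypothetical parameters: r is the averaging
    weight, sigma scales the (persistent) gradient noise."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    z = x.copy()
    t = 1.0  # start t away from 0 so the r/t averaging weight stays finite
    for _ in range(int(T / dt)):
        g = grad_f(x)
        # averaging step: dx/dt = (r/t) (z - x)
        x += dt * (r / t) * (z - x)
        # noisy descent step: dz = -(t/r) grad f(x) dt + sigma dB_t
        z += -dt * (t / r) * g + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
        t += dt
    return x

# Smooth convex test objective f(x) = ||x||^2 / 2, so grad f(x) = x.
x_final = simulate_stochastic_amd(lambda x: x, x0=[5.0, -3.0])
```

With persistent noise (constant σ), the iterate settles into a noise-dominated neighborhood of the minimizer rather than converging exactly, which is the regime where the choice of learning rate and averaging weights relative to the noise covariation matters.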

Author Information

Walid Krichene (Google)
Peter Bartlett (UC Berkeley)
