
Dynamical mean-field theory for stochastic gradient descent in Gaussian mixture classification
Francesca Mignacco · Florent Krzakala · Pierfrancesco Urbani · Lenka Zdeborová

Tue Dec 08 09:00 AM -- 11:00 AM (PST) @ Poster Session 1 #473

We analyze in closed form the learning dynamics of stochastic gradient descent (SGD) for a single-layer neural network classifying a high-dimensional Gaussian mixture where each cluster is assigned one of two labels. This problem provides a prototype of a non-convex loss landscape with interpolating regimes and a large generalization gap. We define a particular stochastic process for which SGD can be extended to a continuous-time limit that we call stochastic gradient flow. In the full-batch limit we recover the standard gradient flow. We apply dynamical mean-field theory from statistical physics to track the dynamics of the algorithm in the high-dimensional limit via a self-consistent stochastic process. We explore the performance of the algorithm as a function of the control parameters, shedding light on how it navigates the loss landscape.
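The setting the abstract describes — discrete-time SGD on a two-cluster Gaussian mixture classified by a single-layer network — can be simulated directly. The sketch below is an illustrative toy version, not the paper's DMFT analysis or its exact parametrization: the cluster mean, loss function, dimensions, learning rate, and batch size are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: two Gaussian clusters in dimension d, centered at
# +/- mu/sqrt(d), with labels y = +/-1 (one label per cluster).
d, n, lr, epochs, batch = 200, 1000, 0.5, 50, 50
mu = rng.standard_normal(d)
y = rng.choice([-1.0, 1.0], size=n)
X = y[:, None] * mu / np.sqrt(d) + rng.standard_normal((n, d))

# Single-layer (linear) classifier: prediction is sign(x . w / sqrt(d)).
w = np.zeros(d)

def logistic_loss_grad(w, Xb, yb):
    """Gradient of the mean logistic loss over a mini-batch."""
    z = yb * (Xb @ w) / np.sqrt(d)
    s = -yb / (1.0 + np.exp(z))           # dloss/dz for each sample
    return (Xb.T @ s) / (np.sqrt(d) * len(yb))

# Mini-batch SGD; batch = n would recover full-batch gradient descent,
# the discrete analogue of the gradient-flow limit mentioned above.
for _ in range(epochs):
    idx = rng.permutation(n)
    for start in range(0, n, batch):
        b = idx[start:start + batch]
        w -= lr * logistic_loss_grad(w, X[b], y[b])

acc = np.mean(np.sign(X @ w) == y)  # training accuracy of the learned classifier
```

In the high-dimensional limit (d, n large at fixed ratio), the DMFT approach of the paper replaces such direct simulation with a self-consistent stochastic process for the dynamics; the snippet only shows the microscopic dynamics being tracked.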

Author Information

Francesca Mignacco (IPhT, CEA Saclay)
Florent Krzakala (ENS Paris, Sorbonne Université & EPFL)
Pierfrancesco Urbani (Institut de Physique Théorique)
Lenka Zdeborová (EPFL)
