

Poster

Loss Symmetry and Noise Equilibrium of Stochastic Gradient Descent

Liu Ziyin · Mingze Wang · Hongchao Li · Lei Wu

East Exhibit Hall A-C #4801
Thu 12 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract: Symmetries exist abundantly in the loss functions of neural networks. We characterize the learning dynamics of stochastic gradient descent (SGD) when exponential symmetries, a broad subclass of continuous symmetries, exist in the loss function. We show that when the gradient noise is unbalanced, SGD tends to move the model parameters toward a point where the noise from different directions is balanced. A special type of fixed point along the symmetry (constant) directions of the loss function thus emerges as a candidate solution for SGD. As the main theoretical result, we prove that every parameter $\theta$ is connected, without a barrier in the loss function, to a unique noise-balanced fixed point $\theta^*$. The theory implies that the balancing of gradient noise can serve as a novel alternative mechanism for phenomena such as progressive sharpening and flattening, and it can be applied to understand common practical problems such as representation normalization, matrix factorization, warmup, and the formation of latent representations.
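The rescaling symmetry of factorized models gives a concrete picture of the claim. The sketch below is a hypothetical toy example, not code from the paper: for a scalar factorization $f(u,v)=uv$ with squared loss, the loss is invariant under $u \to \lambda u,\ v \to v/\lambda$ (an exponential symmetry), gradient flow conserves $u^2 - v^2$, and each noisy SGD step multiplies $u^2 - v^2$ by $(1 - \eta^2\,\mathrm{resid}^2)$, so the parameters drift toward the noise-balanced point $|u| = |v|$.

```python
import numpy as np

# Hypothetical toy example (not from the paper): scalar factorization f(u, v) = u * v
# fit to a target y under a noisy squared loss, mimicking mini-batch gradient noise.
# The loss is invariant under the rescaling u -> a*u, v -> v/a (an exponential
# symmetry). Gradient flow conserves u^2 - v^2, but one SGD step multiplies it by
# (1 - lr^2 * resid^2), so gradient noise drives the parameters toward the
# noise-balanced fixed point |u| = |v|.
rng = np.random.default_rng(0)
y, lr, noise_std, steps = 1.0, 0.05, 1.0, 50_000

u, v = 2.0, 0.1  # start far from the balanced point |u| = |v|
for _ in range(steps):
    y_t = y + noise_std * rng.standard_normal()  # noisy per-step target
    resid = u * v - y_t
    gu, gv = resid * v, resid * u                # gradients of 0.5 * (u*v - y_t)^2
    u, v = u - lr * gu, v - lr * gv

print(f"u*v = {u * v:.3f}  (target {y})")
print(f"|u| = {abs(u):.3f}, |v| = {abs(v):.3f}  -> noise balances the norms")
```

With `noise_std = 0` the residual vanishes at the minimum, the shrinkage of $u^2 - v^2$ stops, and the initial imbalance persists; it is the persistent gradient noise that selects the balanced fixed point.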
