Annihilation of Spurious Minima in Two-Layer ReLU Networks

Yossi Arjevani · Michael Field

Hall J #319

Keywords: [ saddles ] [ symmetry ] [ neural networks ] [ optimization ] [ ReLU ] [ spurious minima ] [ two layers ] [ symmetry breaking ] [ bad local minima ]

Thu 1 Dec 9 a.m. PST — 11 a.m. PST
Spotlight presentation: Lightning Talks 4B-2
Wed 7 Dec 5:30 p.m. PST — 5:45 p.m. PST


We study the optimization problem associated with fitting two-layer ReLU neural networks with respect to the squared loss, where labels are generated by a target network. We use the rich symmetry structure of the problem to develop a novel set of tools for studying the mechanism by which over-parameterization annihilates spurious minima. Sharp analytic estimates are obtained for the loss and the Hessian spectrum at different minima, and it is shown that adding neurons can turn symmetric spurious minima into saddles through a local mechanism that does not generate new spurious minima; minima of smaller symmetry require more neurons. Using Cauchy's interlacing theorem, we prove the existence of descent directions in certain subspaces arising from the symmetry structure of the loss function. This analytic approach uses techniques, new to the field, from algebraic geometry, representation theory, and symmetry breaking, and rigorously confirms the effectiveness of over-parameterization in making the associated loss landscape accessible to gradient-based methods. For a fixed number of neurons and inputs, the spectral results remain true under symmetry-breaking perturbations of the target.
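
To make the setting concrete, here is a minimal sketch (not the paper's construction) of the optimization problem: an over-parameterized two-layer ReLU student is fit to labels produced by a fixed target network by running plain gradient descent on the squared loss. All sizes, the random target weights V, the step size, and the iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: the student has more neurons than the target
# (over-parameterization).
d, k_target, k_student, n = 5, 3, 5, 500

relu = lambda z: np.maximum(z, 0.0)

# Fixed target (teacher) network x -> sum_i relu(v_i . x); random weights
# here stand in for the paper's target network.
V = rng.standard_normal((k_target, d))

X = rng.standard_normal((n, d))
y = relu(X @ V.T).sum(axis=1)  # labels generated by the target network

def loss(W):
    """Squared loss of the student network x -> sum_j relu(w_j . x)."""
    return 0.5 * np.mean((relu(X @ W.T).sum(axis=1) - y) ** 2)

def grad(W):
    """Gradient of the squared loss in the student weights (a.e. in W)."""
    pre = X @ W.T                          # (n, k_student) pre-activations
    err = (relu(pre).sum(axis=1) - y) / n  # (n,) scaled residuals
    return ((pre > 0) * err[:, None]).T @ X

# Plain gradient descent; the step size is an arbitrary choice.
W = 0.5 * rng.standard_normal((k_student, d))
for _ in range(2000):
    W -= 0.05 * grad(W)
print(f"final loss: {loss(W):.3e}")
```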

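The descent-direction argument invokes Cauchy's interlacing theorem: the eigenvalues of the Hessian compressed to a subspace interlace those of the full Hessian, so a negative eigenvalue of the compression certifies a descent direction inside that subspace. The sketch below numerically checks the interlacing inequalities on a random symmetric matrix standing in for a Hessian; the matrix, the subspace, and the sizes are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Cauchy interlacing: for a symmetric H (m x m) and orthonormal Q (m x r),
# the ascending eigenvalues of B = Q^T H Q satisfy
#     lam_i(H) <= lam_i(B) <= lam_{i + m - r}(H),  i = 1, ..., r.
m, r = 8, 3
H = rng.standard_normal((m, m))
H = (H + H.T) / 2                                 # stand-in for a Hessian
Q, _ = np.linalg.qr(rng.standard_normal((m, r)))  # orthonormal subspace basis

lam = np.linalg.eigvalsh(H)           # ascending eigenvalues of H
mu = np.linalg.eigvalsh(Q.T @ H @ Q)  # ascending eigenvalues of the compression

for i in range(r):
    assert lam[i] <= mu[i] + 1e-10
    assert mu[i] <= lam[i + m - r] + 1e-10

# Consequence used in the descent argument: if the compressed Hessian has a
# negative eigenvalue, so does the full Hessian, i.e. a descent direction
# exists inside the subspace spanned by Q.
print("interlacing holds; smallest compressed eigenvalue:", mu[0])
```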