Taxonomizing local versus global structure in neural network loss landscapes
Yaoqing Yang · Liam Hodgkinson · Ryan Theisen · Joe Zou · Joseph Gonzalez · Kannan Ramchandran · Michael W Mahoney

Tue Dec 07 04:30 PM -- 06:00 PM (PST)

Viewing neural network models in terms of their loss landscapes has a long history in the statistical mechanics approach to learning, and in recent years it has received attention within machine learning proper. Among other things, local metrics (such as the smoothness of the loss landscape) have been shown to correlate with global properties of the model (such as good generalization performance). Here, we perform a detailed empirical analysis of the loss landscape structure of thousands of neural network models, systematically varying learning tasks, model architectures, and/or the quantity and quality of data. By considering a range of metrics that attempt to capture different aspects of the loss landscape, we demonstrate that the best test accuracy is obtained when: the loss landscape is globally well-connected; ensembles of trained models are more similar to each other; and models converge to locally smooth regions. We also show that globally poorly-connected landscapes can arise when models are small or when they are trained on lower-quality data; and that, if the loss landscape is globally poorly-connected, then training to zero loss can actually lead to worse test accuracy. Our detailed empirical results shed light on phases of learning (and consequent double descent behavior), fundamental versus incidental determinants of good generalization, the role of load-like and temperature-like parameters in the learning process, different influences on the loss landscape from model and data, and the relationships between local and global metrics, all topics of recent interest.
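The abstract's distinction between local metrics (smoothness around a trained model) and global metrics (connectivity between trained models) can be illustrated on a toy loss surface. The sketch below is not the paper's methodology; it is a minimal, assumed illustration in which local sharpness is estimated as the mean loss increase under small random weight perturbations, and a crude linear mode-connectivity barrier is measured between two minima. All function names (`loss`, `sharpness`, `barrier`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(w):
    # Toy non-convex loss with multiple minima, standing in for a
    # network's training loss over weights w.
    return np.sum((w**2 - 1.0)**2)

def sharpness(w, radius=0.05, n_samples=200):
    # Local metric: mean loss increase under small random perturbations
    # of fixed norm `radius` -- a finite-sample proxy for smoothness at w.
    base = loss(w)
    deltas = rng.normal(size=(n_samples, w.size))
    deltas *= radius / np.linalg.norm(deltas, axis=1, keepdims=True)
    return np.mean([loss(w + d) for d in deltas]) - base

def barrier(w1, w2, n_points=51):
    # Global metric: maximum loss along the straight line between two
    # minima, minus the worse endpoint loss -- a crude linear
    # mode-connectivity barrier (zero would mean "well-connected").
    ts = np.linspace(0.0, 1.0, n_points)
    path = [loss((1 - t) * w1 + t * w2) for t in ts]
    return max(path) - max(loss(w1), loss(w2))

w_a = np.array([1.0, 1.0])    # one minimum of the toy loss
w_b = np.array([-1.0, 1.0])   # another, separated minimum
print(sharpness(w_a))         # small positive: locally smooth region
print(barrier(w_a, w_b))      # large positive: poorly connected on a line
```

On this surface the two minima are smooth locally but separated by a high linear barrier, the kind of globally poorly-connected configuration the abstract associates with worse test accuracy when training to zero loss.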

Author Information

Yaoqing Yang (UC Berkeley)
Liam Hodgkinson (UC Berkeley)
Ryan Theisen (University of California Berkeley)
Joe Zou (University of California Berkeley)
Joseph Gonzalez (UC Berkeley)
Kannan Ramchandran (UC Berkeley)
Michael W Mahoney (UC Berkeley)
