A Reparametrization-Invariant Sharpness Measure Based on Information Geometry

Cheongjae Jang · Sungyoon Lee · Frank Park · Yung-Kyun Noh

Hall J #415

Keywords: [ Information geometry ] [ Sharpness measure ] [ Reparametrization invariance ] [ Generalization ] [ Fisher information matrix ] [ Deep learning ]

[ Abstract ]
[ Paper ] [ Poster ] [ OpenReview ]
Tue 29 Nov 2 p.m. PST — 4 p.m. PST


It has been observed that the generalization performance of neural networks correlates with the sharpness of their loss landscape. Dinh et al. (2017) showed that existing formulations of sharpness measures fail to be invariant with respect to scaling and reparametrization. While some scale-invariant measures have recently been proposed, reparametrization-invariant measures are still lacking. Moreover, existing measures often provide neither theoretical insight into generalization performance nor a practical means of improving it. Based on an information geometric analysis of the neural network parameter space, in this paper we propose a reparametrization-invariant sharpness measure that captures the change in loss with respect to changes in the probability distribution modeled by the network, rather than with respect to changes in the parameter values. We establish theoretical connections between our measure and generalization performance. In particular, experiments confirm that using our measure as a regularizer in neural network training significantly improves performance.
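To illustrate the kind of invariance the abstract refers to, the toy sketch below (an illustration of the general idea, not the paper's actual measure) uses Dinh et al.'s rescaling trick on a two-parameter model f(x) = a·b·x with a Gaussian likelihood: the parametrizations (a, b) and (αa, b/α) represent the same function, so a Euclidean curvature measure such as tr(H) changes with α, while a Fisher-normalized quantity such as tr(F⁺H) does not.

```python
import numpy as np

def fisher_and_hessian(a, b, x, y):
    """Empirical Fisher F and loss Hessian H for the toy model y ~ N(a*b*x, 1),
    with squared-error loss L = 0.5 * sum((y - a*b*x)^2)."""
    J = np.stack([b * x, a * x], axis=1)   # (n, 2) Jacobian of predictions wrt (a, b)
    F = J.T @ J                            # empirical Fisher information (Gauss-Newton term)
    r = a * b * x - y                      # residuals
    # Hessian = Gauss-Newton term + residual-weighted cross term d^2(ab)/dadb
    H = J.T @ J + np.sum(r * x) * np.array([[0.0, 1.0], [1.0, 0.0]])
    return F, H

x = np.array([1.0, 2.0, 3.0])
y = 2.0 * x                                # any (a, b) with a*b = 2 is a global minimizer

# Two parametrizations of the SAME function (alpha-rescaling of Dinh et al.)
for a, b in [(1.0, 2.0), (10.0, 0.2)]:
    F, H = fisher_and_hessian(a, b, x, y)
    euclid = np.trace(H)                        # Euclidean sharpness: changes under rescaling
    fisher = np.trace(np.linalg.pinv(F) @ H)    # Fisher-normalized: invariant here
    print(f"a={a:5.1f} b={b:4.1f}  tr(H)={euclid:8.2f}  tr(F+ H)={fisher:.4f}")
```

Running this prints a Euclidean trace that grows by roughly a factor of 20 under the rescaling, while the Fisher-normalized trace stays at 1.0 for both parametrizations, since at the minimum the residual term vanishes and H equals F. The pseudoinverse is needed because F is rank-deficient in this toy model.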
