In the realm of deep learning, the Fisher information matrix (FIM) provides novel insights and useful tools to characterize the loss landscape, perform second-order optimization, and build geometric learning theories. However, the exact FIM is either unavailable in closed form or too expensive to compute. In practice, it is almost always estimated from empirical samples. We investigate two such estimators, based on two equivalent representations of the FIM --- both unbiased and consistent. Their estimation quality is naturally gauged by their variance, which we give in closed form. We analyze how the parametric structure of a deep neural network can affect this variance. The meaning of this variance measure and its upper bounds are then discussed in the context of deep learning.
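The two equivalent representations mentioned in the abstract are the classical ones: the FIM as the expected outer product of the score, and as the expected negative Hessian of the log-likelihood. The sketch below (an illustration, not the paper's code) compares the two resulting empirical estimators on a toy model p(x|θ) = N(x; θ, 1), whose true Fisher information is 1; it shows that both are unbiased yet can have very different variances, which is the quantity the paper studies.

```python
import numpy as np

# Toy model: p(x | theta) = N(x; theta, 1), true Fisher information = 1.
rng = np.random.default_rng(0)
theta = 0.5
n = 10_000
x = rng.normal(theta, 1.0, size=n)  # empirical samples from the model

# Estimator 1: score outer-product form, E[(d/dtheta log p)^2].
score = x - theta                   # d/dtheta log p(x | theta)
fim_outer = np.mean(score ** 2)

# Estimator 2: negative-Hessian form, -E[d^2/dtheta^2 log p].
hessian = -np.ones(n)               # d^2/dtheta^2 log p(x | theta) = -1 for all x
fim_hess = np.mean(-hessian)

print(fim_outer, fim_hess)          # both near 1; estimator 2 is exact here
```

For this Gaussian model the Hessian is constant, so the negative-Hessian estimator has zero variance while the outer-product estimator fluctuates around 1 — a simple instance of two unbiased estimators of the same FIM differing in estimation quality.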
Author Information
Alexander Soen (Australian National University)
Ke Sun (Data61 and Australian National University)
More from the Same Authors
- 2022 Poster: Fair Wrapping for Black-box Predictions
  Alexander Soen · Ibrahim Alabdulmohsin · Sanmi Koyejo · Yishay Mansour · Nyalleng Moorosi · Richard Nock · Ke Sun · Lexing Xie
- 2021 Poster: Contrastive Laplacian Eigenmaps
  Hao Zhu · Ke Sun · Peter Koniusz
- 2018 Poster: Representation Learning of Compositional Data
  Marta Avalos · Richard Nock · Cheng Soon Ong · Julien Rouar · Ke Sun