We consider generalized linear models in regimes where the number of nonzero components of the signal and the number of accessible data points are sublinear with respect to the size of the signal. We prove a variational formula for the asymptotic mutual information per sample as the system size grows to infinity. This result allows us to derive an expression for the minimum mean-square error (MMSE) of the Bayesian estimator when the signal entries have a discrete distribution with finite support. We find that, for such signals and suitable vanishing scalings of the sparsity and sampling rate, the MMSE is a nonincreasing, piecewise-constant function of the sampling rate. In specific instances the MMSE even displays an all-or-nothing phase transition, that is, the MMSE sharply jumps from its maximum value to zero at a critical sampling rate. The all-or-nothing phenomenon has previously been shown to occur in high-dimensional linear regression. Our analysis goes beyond the linear case and applies to learning the weights of a perceptron with a general activation function in a teacher-student scenario. In particular, we discuss an all-or-nothing phenomenon for the generalization error with a sublinear number of training examples.
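The setting above — a teacher-student perceptron with a sparse weight vector and sublinearly many samples — can be illustrated with a minimal data-generation sketch. This is a hypothetical toy, not the paper's code: the dimension, the particular sublinear scalings (k = n^0.5 nonzeros, m = n^0.6 samples), the ±1 discrete prior, and the sign activation are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: sparsity k and sample count m both grow
# sublinearly in the signal size n (exponents are arbitrary choices).
n = 10_000          # signal size
k = int(n ** 0.5)   # number of nonzero components, k = sqrt(n) = 100
m = int(n ** 0.6)   # number of samples, m = n^0.6 (sublinear in n)

# Teacher weights: k nonzero entries drawn from a discrete prior
# with finite support, here {-1, +1}.
w = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
w[support] = rng.choice([-1.0, 1.0], size=k)

# Student observes Gaussian inputs and labels produced by the teacher
# through an activation function (here sign, as in a noiseless perceptron).
X = rng.standard_normal((m, n)) / np.sqrt(k)  # scaled so X @ w is O(1)
y = np.sign(X @ w)

print(f"n={n}, k={k}, m={m}, nonzeros={np.count_nonzero(w)}")
```

In this regime the paper's result says the Bayes-optimal error, as a function of the (suitably rescaled) sampling rate, stays flat and then can drop abruptly to zero at a critical rate, rather than decaying gradually.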
Author Information
Clément Luneau (École Polytechnique Fédérale de Lausanne)
Jean Barbier (EPFL)
Nicolas Macris (EPFL)
Related Events (a corresponding poster, oral, or spotlight)
- 2020 Poster: Information theoretic limits of learning a sparse rule »
  Tue. Dec 8th 05:00 -- 07:00 PM, Poster Session 1 #236
More from the Same Authors
- 2021 Poster: Model, sample, and epoch-wise descents: exact solution of gradient flow in the random feature model »
  Antoine Bodin · Nicolas Macris
- 2020 Poster: All-or-nothing statistical and computational phase transitions in sparse spiked matrix estimation »
  Jean Barbier · Nicolas Macris · Cynthia Rush
- 2018 Poster: Entropy and mutual information in models of deep neural networks »
  Marylou Gabrié · Andre Manoel · Clément Luneau · Jean Barbier · Nicolas Macris · Florent Krzakala · Lenka Zdeborová
- 2018 Poster: The committee machine: Computational to statistical gaps in learning a two-layers neural network »
  Benjamin Aubin · Antoine Maillard · Jean Barbier · Florent Krzakala · Nicolas Macris · Lenka Zdeborová
- 2018 Spotlight: The committee machine: Computational to statistical gaps in learning a two-layers neural network »
  Benjamin Aubin · Antoine Maillard · Jean Barbier · Florent Krzakala · Nicolas Macris · Lenka Zdeborová
- 2018 Spotlight: Entropy and mutual information in models of deep neural networks »
  Marylou Gabrié · Andre Manoel · Clément Luneau · Jean Barbier · Nicolas Macris · Florent Krzakala · Lenka Zdeborová
- 2016 Poster: Mutual information for symmetric rank-one matrix estimation: A proof of the replica formula »
  Jean Barbier · Mohamad Dia · Nicolas Macris · Florent Krzakala · Thibault Lesieur · Lenka Zdeborová