We determine statistical and computational limits for the estimation of a rank-one matrix (the spike) corrupted by an additive Gaussian noise matrix, in a sparse limit where the underlying hidden vector generating the rank-one matrix has a number of non-zero components that scales sub-linearly with its total dimension, and the signal-to-noise ratio tends to infinity at an appropriate speed. We prove explicit low-dimensional variational formulas for the asymptotic mutual information between the spike and the observed noisy matrix, and we analyze the approximate message passing algorithm in the sparse regime. For Bernoulli and Bernoulli-Rademacher distributed vectors, and when the sparsity and signal strength satisfy an appropriate scaling relation, we find all-or-nothing phase transitions for the asymptotic minimum and algorithmic mean-square errors: these jump from their maximum possible value to zero at well-defined signal-to-noise thresholds whose asymptotic values we determine exactly. In this asymptotic regime the statistical-to-algorithmic gap diverges, indicating that sparse recovery is computationally hard for approximate message passing.
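As an illustrative sketch only (not part of the paper's analysis), the scalar state-evolution recursion associated with Bayes-optimal approximate message passing on the symmetric rank-one model Y = sqrt(lambda/n) x x^T + W, with a Bernoulli-Rademacher prior, can be iterated numerically to obtain an asymptotic algorithmic mean-square error. The parameter values rho and lam below, the integration grid, and the recursion q_{t+1} = E[X^2] - mmse(lambda q_t) follow the usual conventions for this model and are assumptions made here for illustration; in the paper, the sparsity and signal-to-noise ratio instead scale jointly with the dimension.

```python
import numpy as np

# Illustrative sketch (assumptions, not the paper's exact setup): state evolution
# for Bayes-optimal AMP on Y = sqrt(lam/n) * x x^T + W, W_ij ~ N(0,1), with a
# Bernoulli-Rademacher prior. Here rho and lam are fixed constants; the paper's
# sparse regime lets them scale with the dimension.

rho = 0.05               # P(X != 0); non-zero entries are +-1/sqrt(rho), so E[X^2] = 1
lam = 3.0                # signal-to-noise ratio (illustrative value)
a = 1.0 / np.sqrt(rho)   # magnitude of the non-zero entries

def mmse(gamma, half_width=8.0, num=4001):
    """Scalar minimum mean-square error for estimating X from sqrt(gamma)*X + Z,
    Z ~ N(0,1), under the three-atom prior {+a, -a, 0}, by numerical integration."""
    z, dz = np.linspace(-half_width, half_width, num, retstep=True)
    phi = np.exp(-z ** 2 / 2) / np.sqrt(2 * np.pi)         # standard normal density
    atoms = [a, -a, 0.0]
    weights = [rho / 2, rho / 2, 1 - rho]
    post_second_moment = 0.0
    for x0, w in zip(atoms, weights):
        y = np.sqrt(gamma) * x0 + z                        # observation given X = x0
        lp = (rho / 2) * np.exp(-(y - np.sqrt(gamma) * a) ** 2 / 2)
        lm = (rho / 2) * np.exp(-(y + np.sqrt(gamma) * a) ** 2 / 2)
        l0 = (1 - rho) * np.exp(-y ** 2 / 2)
        xhat = a * (lp - lm) / (lp + lm + l0)              # posterior mean E[X | y]
        post_second_moment += w * np.sum(xhat ** 2 * phi) * dz
    return 1.0 - post_second_moment                        # E[X^2] - E[E[X|Y]^2]

# State evolution: q_{t+1} = E[X^2] - mmse(lam * q_t). The uninformative q = 0 is a
# fixed point, so start from a small positive overlap.
q = 1e-3
for _ in range(100):
    q = 1.0 - mmse(lam * q)

print(f"fixed-point overlap q* ~ {q:.4f}, algorithmic mean-square error ~ {1.0 - q:.4f}")
```

The printed fixed point gives the overlap reached by AMP from a weakly informative start at these (fixed) values of rho and lam; the all-or-nothing behaviour described in the abstract concerns the limit where both quantities scale with the dimension, which a single finite run like this does not reproduce.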
Author Information
Jean Barbier (EPFL)
Nicolas Macris (EPFL)
Cynthia Rush (Columbia University)
Since September 2016, Cynthia Rush has been an Assistant Professor in the Department of Statistics at Columbia University. In May 2016, she received her Ph.D. in Statistics from Yale University under the supervision of Andrew Barron; she completed her undergraduate coursework at the University of North Carolina at Chapel Hill, where she obtained a B.S. in Mathematics. Her research uses tools and ideas from information theory, statistical physics, and applied probability as a framework for understanding modern high-dimensional inference and estimation problems and complex machine learning tasks that are core challenges in statistics and data science.
More from the Same Authors
- 2021 Poster: Model, sample, and epoch-wise descents: exact solution of gradient flow in the random feature model
  Antoine Bodin · Nicolas Macris
- 2020 Poster: Information theoretic limits of learning a sparse rule
  Clément Luneau · Jean Barbier · Nicolas Macris
- 2020 Spotlight: Information theoretic limits of learning a sparse rule
  Clément Luneau · Jean Barbier · Nicolas Macris
- 2018 Poster: Entropy and mutual information in models of deep neural networks
  Marylou Gabrié · Andre Manoel · Clément Luneau · Jean Barbier · Nicolas Macris · Florent Krzakala · Lenka Zdeborová
- 2018 Poster: The committee machine: Computational to statistical gaps in learning a two-layers neural network
  Benjamin Aubin · Antoine Maillard · Jean Barbier · Florent Krzakala · Nicolas Macris · Lenka Zdeborová
- 2018 Spotlight: The committee machine: Computational to statistical gaps in learning a two-layers neural network
  Benjamin Aubin · Antoine Maillard · Jean Barbier · Florent Krzakala · Nicolas Macris · Lenka Zdeborová
- 2018 Spotlight: Entropy and mutual information in models of deep neural networks
  Marylou Gabrié · Andre Manoel · Clément Luneau · Jean Barbier · Nicolas Macris · Florent Krzakala · Lenka Zdeborová
- 2016 Poster: Mutual information for symmetric rank-one matrix estimation: A proof of the replica formula
  Jean Barbier · Mohamad Dia · Nicolas Macris · Florent Krzakala · Thibault Lesieur · Lenka Zdeborová