We determine statistical and computational limits for estimation of a rank-one matrix (the spike) corrupted by an additive Gaussian noise matrix, in a sparse limit, where the underlying hidden vector (that constructs the rank-one matrix) has a number of non-zero components that scales sub-linearly with the total dimension of the vector, and the signal-to-noise ratio tends to infinity at an appropriate speed. We prove explicit low-dimensional variational formulas for the asymptotic mutual information between the spike and the observed noisy matrix and analyze the approximate message passing algorithm in the sparse regime. For Bernoulli and Bernoulli-Rademacher distributed vectors, and when the sparsity and signal strength satisfy an appropriate scaling relation, we find all-or-nothing phase transitions for the asymptotic minimum and algorithmic mean-square errors. These jump from their maximum possible value to zero at well-defined signal-to-noise thresholds whose asymptotic values we determine exactly. In the asymptotic regime the statistical-to-algorithmic gap diverges, indicating that sparse recovery is hard for approximate message passing.
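To make the objects in the abstract concrete, here is a minimal, self-contained sketch of approximate message passing (AMP) on a rank-one spiked Wigner model. Note this uses a dense Rademacher spike and fixed illustrative parameters (`n`, `lam`, the `tanh` denoiser), not the paper's sparse, diverging-SNR regime; it is a toy illustration under those assumptions, not the authors' experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, T = 1000, 4.0, 25  # dimension, SNR, AMP iterations (illustrative values)

# Spiked Wigner model: observe sqrt(lam/n) * x x^T plus symmetric Gaussian noise.
x = rng.choice([-1.0, 1.0], size=n)        # Rademacher spike (dense, for simplicity)
W = rng.normal(size=(n, n))
W = (W + W.T) / np.sqrt(2 * n)             # symmetric noise, entries ~ N(0, 1/n)
M = np.sqrt(lam) / n * np.outer(x, x) + W  # rescaled observation

# Spectral initialization: power iteration towards the top eigenvector of M.
v = rng.normal(size=n)
for _ in range(200):
    v = M @ v
    v /= np.linalg.norm(v)
xt = np.sqrt(n) * v                        # rescale so entries are of order one

# AMP with the Bayes-optimal denoiser f(r) = tanh(sqrt(lam) * r) for a
# Rademacher prior, including the Onsager memory correction term.
f = lambda r: np.tanh(np.sqrt(lam) * r)
f_prev = np.zeros(n)
for _ in range(T):
    ft = f(xt)
    b = np.sqrt(lam) * np.mean(1.0 - ft**2)  # Onsager coefficient: (1/n) sum f'(x_i)
    xt, f_prev = M @ ft - b * f_prev, ft

# Normalized overlap of the AMP estimate with the hidden spike (sign is not
# identifiable from M, hence the absolute value).
overlap = abs(np.dot(f(xt), x)) / n
print(overlap)
```

At this SNR (well above the dense-case transition) the overlap is close to one; in the paper's sparse scaling, by contrast, AMP fails below a diverging algorithmic threshold.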
Author Information
Jean Barbier (EPFL)
Nicolas Macris (EPFL)
Cynthia Rush (Columbia University)
Since September 2016, Cynthia Rush has been an Assistant Professor in the Department of Statistics at Columbia University. In May 2016, she received her Ph.D. in Statistics from Yale University under the supervision of Andrew Barron, and she completed her undergraduate coursework at the University of North Carolina at Chapel Hill, where she obtained a B.S. in Mathematics. Her research uses tools and ideas from information theory, statistical physics, and applied probability as a framework for understanding modern, high-dimensional inference and estimation problems and complex machine learning tasks that are core challenges in statistics and data science.
More from the Same Authors

2021 Poster: Model, sample, and epoch-wise descents: exact solution of gradient flow in the random feature model »
Antoine Bodin · Nicolas Macris 
2020 Poster: Information theoretic limits of learning a sparse rule »
Clément Luneau · Jean Barbier · Nicolas Macris 
2020 Spotlight: Information theoretic limits of learning a sparse rule »
Clément Luneau · Jean Barbier · Nicolas Macris 
2018 Poster: Entropy and mutual information in models of deep neural networks »
Marylou Gabrié · Andre Manoel · Clément Luneau · Jean Barbier · Nicolas Macris · Florent Krzakala · Lenka Zdeborová 
2018 Poster: The committee machine: Computational to statistical gaps in learning a two-layers neural network »
Benjamin Aubin · Antoine Maillard · Jean Barbier · Florent Krzakala · Nicolas Macris · Lenka Zdeborová 
2018 Spotlight: The committee machine: Computational to statistical gaps in learning a two-layers neural network »
Benjamin Aubin · Antoine Maillard · Jean Barbier · Florent Krzakala · Nicolas Macris · Lenka Zdeborová 
2018 Spotlight: Entropy and mutual information in models of deep neural networks »
Marylou Gabrié · Andre Manoel · Clément Luneau · Jean Barbier · Nicolas Macris · Florent Krzakala · Lenka Zdeborová 
2016 Poster: Mutual information for symmetric rank-one matrix estimation: A proof of the replica formula »
Jean Barbier · Mohamad Dia · Nicolas Macris · Florent Krzakala · Thibault Lesieur · Lenka Zdeborová