

Poster

Resolution Limits of Sparse Coding in High Dimensions

Alyson Fletcher · Sundeep Rangan · Vivek K Goyal


Abstract: Recent research suggests that neural systems employ sparse coding. However, there is limited theoretical understanding of the fundamental resolution limits of such sparse coding. This paper considers a general sparse estimation problem: detecting the sparsity pattern of a $k$-sparse vector in $\mathbb{R}^n$ from $m$ random noisy measurements. Our main results provide necessary and sufficient conditions on the problem dimensions, $m$, $n$ and $k$, and the signal-to-noise ratio (SNR) for asymptotically-reliable detection. We show that a necessary condition for perfect recovery at any given SNR, for all algorithms regardless of complexity, is $m = \Omega(k\log(n-k))$ measurements. This is considerably stronger than all previous necessary conditions. We also show that the scaling of $\Omega(k\log(n-k))$ measurements is sufficient for a trivial ``maximum correlation'' estimator to succeed. Hence this scaling is order-optimal and is achievable without lasso, matching pursuit, or more sophisticated methods, making the optimal scaling biologically plausible.
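To make the abstract's claim concrete, below is a minimal sketch of a "maximum correlation" estimator of the kind described: it simply correlates each measurement column with the observation and keeps the $k$ largest. The variable names, Gaussian measurement model, and normalization are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def max_correlation_support(A, y, k):
    """Estimate the support of a k-sparse vector from y = A @ x + noise
    by selecting the k columns of A most correlated with y.
    Illustrative sketch only; details are assumptions, not the paper's exact setup."""
    # Correlation of each column of A with the observation y
    correlations = np.abs(A.T @ y)
    # Indices of the k largest correlations = estimated sparsity pattern
    return np.sort(np.argsort(correlations)[-k:])

# Toy usage: k-sparse signal in R^n observed through m random Gaussian measurements
rng = np.random.default_rng(0)
n, k, m = 1000, 10, 400
x = np.zeros(n)
true_support = np.sort(rng.choice(n, size=k, replace=False))
x[true_support] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)   # assumed i.i.d. Gaussian measurement matrix
y = A @ x + 0.01 * rng.normal(size=m)      # assumed small additive noise
print("true:", true_support)
print("est :", max_correlation_support(A, y, k))
```

The point of the sketch is that the estimator requires only one matrix-vector product and a sort, which is why the abstract argues the optimal measurement scaling can be reached by a biologically plausible computation.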
