Resolution Limits of Sparse Coding in High Dimensions
Alyson Fletcher · Sundeep Rangan · Vivek K Goyal

Tue Dec 09 07:30 PM -- 12:00 AM (PST)
Recent research suggests that neural systems employ sparse coding. However, there is limited theoretical understanding of the fundamental resolution limits of such coding. This paper considers a general sparse estimation problem: detecting the sparsity pattern of a $k$-sparse vector in $\mathbb{R}^n$ from $m$ random noisy measurements. Our main results provide necessary and sufficient conditions on the problem dimensions $m$, $n$, and $k$, and on the signal-to-noise ratio (SNR), for asymptotically reliable detection. We show that a necessary condition for perfect recovery at any given SNR, for all algorithms regardless of complexity, is $m = \Omega(k\log(n-k))$ measurements. This is considerably stronger than all previous necessary conditions. We also show that the scaling of $\Omega(k\log(n-k))$ measurements is sufficient for a trivial "maximum correlation" estimator to succeed. Hence this scaling is optimal and does not require the lasso, matching pursuit, or more sophisticated methods, and the optimal scaling can thus be biologically plausible.
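To make the "maximum correlation" estimator concrete, the sketch below sets up the sparsity-pattern detection problem and recovers the support by correlating the observation with each measurement column. The specific dimensions, signal amplitudes, and noise level are illustrative choices for this sketch, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem dimensions: k-sparse vector in R^n, m noisy measurements,
# with m on the order of k*log(n-k) as in the scaling result.
n, k, m = 1000, 5, 400

# Random Gaussian measurement matrix with unit-norm columns.
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)

# k-sparse signal with a uniformly random support and equal-magnitude
# nonzero entries (amplitude chosen here for a comfortably high SNR).
support = rng.choice(n, size=k, replace=False)
x = np.zeros(n)
x[support] = 2.0

# Noisy measurements y = A x + w.
sigma = 0.05
y = A @ x + sigma * rng.standard_normal(m)

# "Maximum correlation" estimator: correlate y with every column of A and
# declare the k largest-magnitude correlations to be the sparsity pattern.
corr = np.abs(A.T @ y)
est_support = np.argsort(corr)[-k:]

# With m ~ k log(n-k) measurements and moderate SNR, the estimated and
# true supports should coincide with high probability.
print(sorted(support.tolist()), sorted(est_support.tolist()))
```

Note the estimator uses only a single matrix-vector product followed by a top-$k$ selection, which is what makes the optimal scaling attainable by a biologically plausible computation.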

Author Information

Allie Fletcher (UCLA)
Sundeep Rangan (Qualcomm)
Vivek K Goyal (Massachusetts Institute of Technology)