

Poster

On the Power of Truncated SVD for General High-rank Matrix Estimation Problems

Simon Du · Yining Wang · Aarti Singh

Pacific Ballroom #221

Keywords: [ Theory ]


Abstract: We show that given an estimate $\hat{A}$ that is close to a general high-rank positive semi-definite (PSD) matrix $A$ in spectral norm (i.e., $\|\hat{A} - A\|_2 \le \delta$), the simple truncated Singular Value Decomposition of $\hat{A}$ produces a multiplicative approximation of $A$ in Frobenius norm. This observation leads to many interesting results on general high-rank matrix estimation problems:

1. High-rank matrix completion: we show that it is possible to recover a general high-rank matrix $A$ up to $(1+\varepsilon)$ relative error in Frobenius norm from partial observations, with sample complexity independent of the spectral gap of $A$.

2. High-rank matrix denoising: we design algorithms that recover a matrix $A$ with relative error in Frobenius norm from its noise-perturbed observations, without assuming $A$ is exactly low-rank.

3. Low-dimensional estimation of high-dimensional covariance: given $N$ i.i.d. samples of dimension $n$ from $\mathcal{N}_n(0, A)$, we show that it is possible to estimate the covariance matrix $A$ with relative error in Frobenius norm with $N \approx n$, improving over classical covariance estimation results which require $N \approx n^2$.
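The core observation above can be illustrated numerically. The following is a minimal sketch (not the paper's algorithm, and all matrix sizes, ranks, and noise levels here are illustrative assumptions): we build an approximately low-rank PSD matrix $A$, perturb it by a symmetric noise matrix of small spectral norm $\delta$, and check that the rank-$k$ truncated SVD of the noisy estimate $\hat{A}$ is closer to $A$ in Frobenius norm than the raw estimate itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 5  # illustrative dimension and truncation rank

# Build a PSD matrix A that is approximately rank k: k large
# eigenvalues plus a small spectral tail.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigvals = np.concatenate([np.linspace(10.0, 5.0, k),
                          1e-3 * np.ones(n - k)])
A = U @ np.diag(eigvals) @ U.T

# Symmetric perturbation E rescaled so that ||E||_2 = delta,
# giving ||A_hat - A||_2 <= delta as in the abstract.
delta = 0.1
E = rng.standard_normal((n, n))
E = (E + E.T) / 2
E *= delta / np.linalg.norm(E, 2)
A_hat = A + E

# Rank-k truncated SVD of the noisy estimate A_hat.
Us, s, Vt = np.linalg.svd(A_hat)
A_k = Us[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

err_trunc = np.linalg.norm(A_k - A, "fro")  # truncated-SVD error
err_raw = np.linalg.norm(A_hat - A, "fro")  # error of raw estimate
print(f"truncated SVD error: {err_trunc:.4f}, raw error: {err_raw:.4f}")
```

In this regime (spectral gap much larger than $\delta$), the truncation discards most of the noise energy outside the top-$k$ subspace, so `err_trunc` comes out well below `err_raw`, consistent with the multiplicative Frobenius-norm guarantee stated in the abstract.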
