Abstract:
We show that given an estimate $\hat{A}$ that is close to a general high-rank positive semi-definite (PSD) matrix $A$ in spectral norm (i.e., $\|\hat{A} - A\|_2 \le \delta$), the simple truncated Singular Value Decomposition of $\hat{A}$ produces a multiplicative approximation of $A$ in Frobenius norm. This observation leads to many interesting results on general high-rank matrix estimation problems:

1. High-rank matrix completion: we show that it is possible to recover a general high-rank matrix $A$ up to $(1+\varepsilon)$ relative error in Frobenius norm from partial observations, with sample complexity independent of the spectral gap of $A$.
2. High-rank matrix denoising: we design algorithms that recover a matrix $A$ with relative error in Frobenius norm from its noise-perturbed observations, without assuming $A$ is exactly low-rank.
3. Low-dimensional estimation of high-dimensional covariance: given $N$ i.i.d. samples of dimension $n$ from $\mathcal{N}_n(\mathbf{0}, A)$, we show that it is possible to estimate the covariance matrix $A$ with relative error in Frobenius norm with $N \approx n$, improving over classical covariance estimation results, which require $N \approx n^2$.
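The central observation can be checked numerically. The sketch below (an illustration, not the paper's algorithm; the matrix sizes, spectrum, and rank $k$ are arbitrary choices) builds a full-rank PSD matrix $A$ with a decaying spectrum, perturbs it by a small amount in spectral norm, and compares the Frobenius error of the rank-$k$ truncated SVD of the perturbed matrix against the best rank-$k$ approximation error of $A$ itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# A general full-rank PSD matrix A = U diag(lambda) U^T with a
# slowly decaying spectrum (so there is no large spectral gap).
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigvals = 1.0 / np.arange(1, n + 1)
A = (U * eigvals) @ U.T

# Symmetric perturbation scaled so that ||A_hat - A||_2 <= delta.
delta = 0.01
E = rng.standard_normal((n, n))
E = (E + E.T) / 2
A_hat = A + delta * E / np.linalg.norm(E, 2)

# Rank-k truncated SVD of the noisy estimate A_hat.
k = 10
u, s, vt = np.linalg.svd(A_hat)
A_hat_k = (u[:, :k] * s[:k]) @ vt[:k, :]

# Best possible rank-k Frobenius error for A (Eckart-Young):
# the L2 norm of the tail eigenvalues of A.
w = np.sort(np.linalg.eigvalsh(A))[::-1]
err_best = np.sqrt(np.sum(w[k:] ** 2))
err_trunc = np.linalg.norm(A_hat_k - A, "fro")

# Ratio close to 1 illustrates the multiplicative approximation.
print(err_trunc / err_best)
```

By Eckart-Young the ratio is always at least 1; for small $\delta$ it stays close to 1 even though $A$ is full-rank, which is the gap-free multiplicative guarantee the abstract refers to.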