
Rank Overspecified Robust Matrix Recovery: Subgradient Method and Exact Recovery
Lijun Ding · Liwei Jiang · Yudong Chen · Qing Qu · Zhihui Zhu

Wed Dec 08 12:30 AM -- 02:00 AM (PST)
We study the robust recovery of a low-rank matrix from sparsely and grossly corrupted Gaussian measurements, with no prior knowledge of the intrinsic rank. We consider the robust matrix factorization approach: we employ a robust $\ell_1$ loss function and handle the unknown rank by using an overspecified factored representation of the matrix variable. We then solve the associated nonconvex nonsmooth problem using a subgradient method with diminishing stepsizes. We show that, under a regularity condition on the sensing matrices and corruption which we call the restricted direction preserving property (RDPP), the subgradient method converges to the exact low-rank solution at a sublinear rate even when the rank is overspecified. Moreover, our result is more general in that the convergence automatically speeds up to a linear rate once the factor rank matches the unknown rank. We further show that the RDPP condition holds in generic settings, such as Gaussian measurements under independent or adversarial sparse corruptions, a result that could be of independent interest. Both the exact recovery and the convergence rate of the proposed subgradient method are verified numerically in the overspecified regime. Our experiments also show that our particular design of diminishing stepsizes effectively prevents overfitting for robust recovery under overparameterized models, such as robust matrix sensing and learning a robust deep image prior. This regularization effect merits further investigation.
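The setup described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's method: the problem sizes, the small random initialization, the corruption model, and the normalized subgradient steps with a geometrically decaying stepsize (mu0, q below) are all illustrative assumptions; the paper's exact stepsize schedule, constants, and experimental setup differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumed for this sketch, not the paper's experiments):
n, r_true, k, m = 20, 2, 5, 1000   # k > r_true: the rank is overspecified
p_corrupt = 0.1                    # fraction of grossly corrupted measurements

# Ground-truth rank-r_true PSD matrix X* = U U^T
U = rng.standard_normal((n, r_true))
X_star = U @ U.T

# Gaussian sensing matrices A_i and sparsely, grossly corrupted measurements
A = rng.standard_normal((m, n, n))
y = np.einsum('mij,ij->m', A, X_star)
outliers = rng.random(m) < p_corrupt
y[outliers] += 10.0 * rng.standard_normal(outliers.sum())

def l1_loss(F):
    """Robust loss f(F) = (1/m) * sum_i |<A_i, F F^T> - y_i|."""
    return np.abs(np.einsum('mij,ij->m', A, F @ F.T) - y).mean()

F = 0.1 * rng.standard_normal((n, k))   # small random initialization
loss_init = l1_loss(F)

# Subgradient method with a diminishing (here: geometrically decaying,
# normalized) stepsize -- an assumed schedule for illustration only.
mu0, q = 0.5, 0.995
for t in range(1500):
    r = np.einsum('mij,ij->m', A, F @ F.T) - y
    # A subgradient of f at F: (1/m) * sum_i sign(r_i) * (A_i + A_i^T) F
    S = np.einsum('m,mij->ij', np.sign(r), A + A.transpose(0, 2, 1)) / m
    G = S @ F
    F -= mu0 * q**t * G / (np.linalg.norm(G) + 1e-12)

rel_err = np.linalg.norm(F @ F.T - X_star) / np.linalg.norm(X_star)
```

Despite the factor F having more columns than the true rank, driving the nonsmooth ℓ1 objective down with diminishing subgradient steps shrinks the relative recovery error ||F F^T - X*||_F / ||X*||_F, in line with the exact-recovery behavior the abstract reports.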

Author Information

Lijun Ding (Cornell University)
Liwei Jiang (Cornell University)
Yudong Chen (Cornell University)
Qing Qu (University of Michigan)
Zhihui Zhu (University of Denver)
