How many samples is a good initial point worth in Low-rank Matrix Recovery?
Jialun Zhang, Richard Zhang
Spotlight presentation: Orals & Spotlights Track 32: Optimization
on 2020-12-10T19:30:00-08:00 - 2020-12-10T19:40:00-08:00
Abstract: Given a sufficiently large amount of labeled data, the nonconvex low-rank matrix recovery problem contains no spurious local minima, so a local optimization algorithm is guaranteed to converge to a global minimum starting from any initial guess. However, the actual amount of data needed by this theoretical guarantee is very pessimistic, as it must prevent spurious local minima from existing anywhere, including at adversarial locations. In contrast, prior work based on good initial guesses has more realistic data requirements, because it allows spurious local minima to exist outside of a neighborhood of the solution. In this paper, we quantify the relationship between the quality of the initial guess and the corresponding reduction in data requirements. Using the restricted isometry constant as a surrogate for sample complexity, we compute a sharp “threshold” number of samples needed to prevent each specific point on the optimization landscape from becoming a spurious local minimum. Optimizing the threshold over regions of the landscape, we see that, for initial points not too close to the ground truth, a linear improvement in the quality of the initial guess amounts to a constant factor improvement in the sample complexity.
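The setting described above can be illustrated with a minimal sketch (not the paper's algorithm): nonconvex recovery of a rank-r matrix M = ZZᵀ by gradient descent on the factored objective f(X) = ¼‖XXᵀ − M‖²_F, started from a hypothetical "good initial guess" near the ground truth. For simplicity the sketch uses a fully observed M rather than the sample-limited, restricted-isometry setting the paper analyzes; the noise level, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 20, 2

# Ground-truth rank-r matrix M = Z Z^T. Fully observed here for
# simplicity; the paper studies how many linear measurements (via the
# restricted isometry constant) are needed instead.
Z = rng.standard_normal((n, r))
M = Z @ Z.T

def loss(X):
    # f(X) = (1/4) * ||X X^T - M||_F^2
    return 0.25 * np.linalg.norm(X @ X.T - M, "fro") ** 2

def grad(X):
    # Gradient of f with respect to X.
    return (X @ X.T - M) @ X

# Hypothetical "good initial guess": ground truth perturbed by small
# noise, standing in for an initialization inside the basin of attraction.
X = Z + 0.1 * rng.standard_normal((n, r))

eta = 1e-3  # illustrative step size
for _ in range(2000):
    X -= eta * grad(X)

print(loss(X))  # residual after local descent from the good initial point
```

From a sufficiently accurate starting point, plain gradient descent drives the residual toward zero; the paper's contribution is quantifying, via a sharp threshold on the restricted isometry constant, how much initialization quality relaxes the data requirement for this kind of local convergence.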