Poster
A Convergent Gradient Descent Algorithm for Rank Minimization and Semidefinite Programming from Random Linear Measurements
Qinqing Zheng · John Lafferty
We propose a simple, scalable, and fast gradient descent algorithm to optimize a nonconvex objective for the rank minimization problem and a closely related family of semidefinite programs. With $O(r^3 \kappa^2 n \log n)$ random measurements of a positive semidefinite $n\times n$ matrix of rank $r$ and condition number $\kappa$, our method is guaranteed to converge linearly to the global optimum.
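The sketch below illustrates the general approach described in the abstract: a factored (Burer–Monteiro style) gradient descent that writes the unknown PSD matrix as $X = UU^\top$ with $U \in \mathbb{R}^{n\times r}$ and minimizes a least-squares objective over the random linear measurements, starting from a spectral initialization. The objective form, initialization, step size, and problem sizes used here are illustrative assumptions for a self-contained demo, not the paper's exact algorithm or constants.

```python
# Minimal NumPy sketch of factored gradient descent for rank minimization from
# random linear measurements. Assumed objective:
#   f(U) = (1/4m) * sum_i (<A_i, U U^T> - b_i)^2,  U in R^{n x r}.
# The spectral initialization and the fixed step size are illustrative choices.
import numpy as np

def generate_problem(n=50, r=2, m=1200, seed=0):
    """Random symmetric Gaussian measurements b_i = <A_i, X*> of a rank-r PSD X*."""
    rng = np.random.default_rng(seed)
    U_star = rng.standard_normal((n, r))
    X_star = U_star @ U_star.T                      # ground-truth PSD matrix
    A = rng.standard_normal((m, n, n))
    A = 0.5 * (A + A.transpose(0, 2, 1))            # symmetrize each A_i
    b = np.einsum('kij,ij->k', A, X_star)           # b_i = <A_i, X*>
    return A, b, X_star

def factored_gradient_descent(A, b, r, step=1e-3, iters=300):
    """Gradient descent on f(U) = (1/4m) * sum_i (<A_i, U U^T> - b_i)^2."""
    m, n, _ = A.shape
    # Spectral initialization (assumed): top-r eigenpairs of (1/m) * sum_i b_i A_i,
    # which concentrates around X* for these measurements.
    M = np.einsum('k,kij->ij', b, A) / m
    vals, vecs = np.linalg.eigh(M)
    idx = np.argsort(vals)[::-1][:r]
    U = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))
    for _ in range(iters):
        resid = np.einsum('kij,ij->k', A, U @ U.T) - b       # <A_i, UU^T> - b_i
        grad = np.einsum('k,kij->ij', resid, A) @ U / m      # gradient w.r.t. U
        U -= step * grad
    return U

if __name__ == "__main__":
    A, b, X_star = generate_problem()
    U = factored_gradient_descent(A, b, r=2)
    rel_err = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)
    print(f"relative recovery error: {rel_err:.2e}")
```

With enough measurements relative to $nr$, the factored iterates contract toward the ground truth; the abstract's guarantee of linear convergence applies to the paper's algorithm and sample-size regime, not to the particular constants chosen in this sketch.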
Author Information
Qinqing Zheng (University of Chicago)
John Lafferty (University of Chicago)
More from the Same Authors
- 2016 Workshop: Adaptive and Scalable Nonparametric Methods in Machine Learning »
  Aaditya Ramdas · Arthur Gretton · Bharath Sriperumbudur · Han Liu · John Lafferty · Samory Kpotufe · Zoltán Szabó
- 2015 Poster: Interpolating Convex and Non-Convex Tensor Decompositions via the Subspace Norm »
  Qinqing Zheng · Ryota Tomioka