

Poster

Conic Descent and its Application to Memory-efficient Optimization over Positive Semidefinite Matrices

John Duchi · Oliver Hinder · Andrew Naber · Yinyu Ye

Poster Session 5 #1637

Abstract:

We present an extension of the conditional gradient method to problems whose feasible sets are convex cones. We provide a convergence analysis for the method and for variants with nonconvex objectives, and we extend the analysis to practical cases with effective line search strategies. For the specific case of the positive semidefinite cone, we present a memory-efficient version based on randomized matrix sketches and advocate a heuristic greedy step that greatly improves its practical performance. Numerical results on phase retrieval and matrix completion problems indicate that our method can offer substantial advantages over traditional conditional gradient and Burer-Monteiro approaches.
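The following is a minimal, hypothetical sketch of a conditional-gradient-style rank-one update over the positive semidefinite cone, shown only to illustrate the kind of iteration the abstract describes; it is not the paper's conic descent algorithm and omits its conic-specific machinery (iterate scaling, randomized matrix sketches, and the greedy step). The toy phase-retrieval-style objective, the problem sizes, and the exact line search over the rank-one step are all assumptions made for the example.

```python
# Hypothetical sketch: a conditional-gradient-style method over the PSD cone,
# NOT the paper's conic descent algorithm. It minimizes the illustrative
# objective f(X) = 0.5 * sum_i (a_i' X a_i - b_i)^2 over X >= 0 by adding one
# rank-one atom per iteration.
import numpy as np
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(0)
n, m = 50, 200                      # assumed matrix size and measurement count
A = rng.standard_normal((m, n))     # measurement vectors a_i as rows
x_true = rng.standard_normal(n)
b = (A @ x_true) ** 2               # b_i = a_i' (x_true x_true') a_i

def measure(X):
    """Linear map A(X)_i = a_i' X a_i."""
    return np.einsum("ij,jk,ik->i", A, X, A)

def grad(X):
    """Gradient of f(X) = 0.5 * ||A(X) - b||^2, i.e. sum_i r_i a_i a_i'."""
    resid = measure(X) - b
    return A.T @ (resid[:, None] * A)

X = np.zeros((n, n))
for t in range(200):
    G = grad(X)
    # Linear minimization over the cone: the best extreme-ray direction is
    # v v' for the eigenvector v of the most negative eigenvalue of the gradient.
    lam, v = eigsh(G, k=1, which="SA")
    if lam[0] >= -1e-8:
        break                       # no descent ray: approximately stationary
    D = np.outer(v[:, 0], v[:, 0])
    # Exact line search over the nonnegative step along the rank-one atom,
    # available in closed form because f is quadratic in the scalar step.
    AD = measure(D)
    resid = measure(X) - b
    step = max(0.0, -(resid @ AD) / (AD @ AD))
    X = X + step * D
```

In this sketch each iteration only needs the gradient's extreme eigenvector and a rank-one update, which hints at why cone-constrained conditional-gradient methods can be made memory-efficient; the paper's sketched variant avoids storing X explicitly, which this illustration does not attempt.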
