Poster
Efficient Convex Relaxations for Streaming PCA
Raman Arora · Teodor Vanislavov Marinov

Thu Dec 12 05:00 PM -- 07:00 PM (PST) @ East Exhibition Hall B + C #57
We revisit two algorithms, matrix stochastic gradient (MSG) and $\ell_2$-regularized MSG (RMSG), that are instances of stochastic gradient descent (SGD) on a convex relaxation to principal component analysis (PCA). These algorithms have been shown empirically to outperform Oja's algorithm in terms of iteration complexity while having comparable runtime. However, these findings are not supported by existing theoretical results. While the iteration complexity bound for $\ell_2$-RMSG was recently shown to match that of Oja's algorithm, its theoretical efficiency was left as an open problem. In this work, we give improved bounds on the per-iteration cost of mini-batched variants of both MSG and $\ell_2$-RMSG and arrive at an algorithm with total computational complexity matching that of Oja's algorithm.
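For readers unfamiliar with the setup, the MSG update is plain projected SGD on the convex relaxation: the iterate is a symmetric matrix $P$ with eigenvalues in $[0, 1]$ and trace $k$, updated with a rank-one gradient step $x x^\top$ per sample and projected back onto that set. Below is a minimal numpy sketch of this template; the function names, step size, and eigendecomposition-based projection are illustrative assumptions, and the paper's efficiency gains come from mini-batching and low-rank bookkeeping that this dense version omits.

```python
import numpy as np

def project_capped_simplex(P, k):
    # Project a symmetric matrix onto {0 <= eigenvalues <= 1, trace = k}
    # by shifting and clipping the spectrum; the shift is found by bisection.
    # (Illustrative dense projection; a practical variant exploits low rank.)
    eigvals, eigvecs = np.linalg.eigh(P)
    lo, hi = eigvals.min() - 1.0, eigvals.max()
    for _ in range(60):
        s = 0.5 * (lo + hi)
        if np.clip(eigvals - s, 0.0, 1.0).sum() > k:
            lo = s  # trace still too large: shift eigenvalues down further
        else:
            hi = s
    clipped = np.clip(eigvals - 0.5 * (lo + hi), 0.0, 1.0)
    return (eigvecs * clipped) @ eigvecs.T  # = V diag(clipped) V^T

def msg(samples, d, k, eta=0.1):
    # One pass of matrix stochastic gradient:
    # SGD step on the rank-one estimate x x^T, then project.
    P = np.zeros((d, d))
    for x in samples:
        P = project_capped_simplex(P + eta * np.outer(x, x), k)
    return P

# Toy usage: estimate the top-2 subspace of a 10-dimensional stream
# whose first two coordinates carry most of the variance.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10)) @ np.diag([3.0] * 2 + [0.3] * 8)
P_hat = msg(X, d=10, k=2)
```

In this dense form each step costs $O(d^3)$ for the eigendecomposition, which is exactly the bottleneck the paper's mini-batched analysis addresses.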

Author Information

Raman Arora (Johns Hopkins University)
Teodor Vanislavov Marinov (Johns Hopkins University)
