

Poster

Fantope Projection and Selection: A near-optimal convex relaxation of sparse PCA

Vincent Vu · Juhee Cho · Jing Lei · Karl Rohe

Harrah's Special Events Center, 2nd Floor

Abstract: We propose a novel convex relaxation of sparse principal subspace estimation based on the convex hull of rank-$d$ projection matrices (the Fantope). The convex problem can be solved efficiently using the alternating direction method of multipliers (ADMM). We establish a near-optimal convergence rate, in terms of the sparsity, ambient dimension, and sample size, for estimation of the principal subspace of a general covariance matrix without assuming the spiked covariance model. In the special case of $d=1$, our result implies the near-optimality of DSPCA even when the solution is not rank 1. We also provide a general theoretical framework for analyzing the statistical properties of the method for arbitrary input matrices that extends the applicability and provable guarantees to a wide array of settings. We demonstrate this with an application to Kendall's tau correlation matrices and transelliptical component analysis.
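As a rough illustration of the approach described in the abstract, the sketch below shows one way an ADMM iteration for the Fantope projection and selection (FPS) problem could be organized, assuming the objective is to maximize $\langle S, H\rangle - \lambda\|H\|_1$ over the Fantope $\mathcal{F}^d = \{H : 0 \preceq H \preceq I,\ \mathrm{tr}(H) = d\}$. Function names, the fixed penalty parameter `rho`, and the bisection-based Fantope projection are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def fantope_projection(A, d):
    """Project a symmetric matrix A onto the Fantope
    {H : 0 <= H <= I, trace(H) = d} by projecting its eigenvalues
    onto the capped simplex (bisection on a common shift gamma)."""
    w, V = np.linalg.eigh(A)

    def capped(gamma):
        return np.clip(w - gamma, 0.0, 1.0)

    lo, hi = w.min() - 1.0, w.max()
    for _ in range(100):              # bisection: sum of capped eigenvalues is decreasing in gamma
        mid = 0.5 * (lo + hi)
        if capped(mid).sum() > d:
            lo = mid
        else:
            hi = mid
    theta = capped(0.5 * (lo + hi))
    return (V * theta) @ V.T          # V diag(theta) V^T

def soft_threshold(A, t):
    """Entrywise soft-thresholding (proximal map of the l1 penalty)."""
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def fps_admm(S, d, lam, rho=1.0, n_iter=200):
    """Illustrative ADMM loop for max_{H in Fantope} <S, H> - lam * ||H||_1,
    split into a Fantope-projection step and a soft-thresholding step."""
    p = S.shape[0]
    H = np.zeros((p, p))
    Z = np.zeros((p, p))
    U = np.zeros((p, p))
    for _ in range(n_iter):
        H = fantope_projection(Z - U + S / rho, d)  # Fantope (eigenvalue) step
        Z = soft_threshold(H + U, lam / rho)        # sparsity step
        U = U + H - Z                               # dual update
    return Z
```

The returned matrix plays the role of the sparse projection-matrix estimate; its leading $d$ eigenvectors can then be taken as the estimated principal subspace.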
