

Poster

DP-PCA: Statistically Optimal and Differentially Private PCA

Xiyang Liu · Weihao Kong · Prateek Jain · Sewoong Oh

Hall J (level 1) #538

Keywords: [ principal component analysis ] [ differential privacy ] [ private estimation ]


Abstract: We study the canonical statistical task of computing the principal component from i.i.d. data under differential privacy. Although extensively studied in the literature, existing solutions fall short in two key aspects: ($i$) even for Gaussian data, existing private algorithms require the number of samples $n$ to scale super-linearly with the dimension $d$, i.e., $n=\Omega(d^{3/2})$, to obtain non-trivial results, while non-private PCA requires only $n=O(d)$, and ($ii$) existing techniques suffer from a large error even when the variance in each data point is small. We propose DP-PCA, a method based on a single-pass minibatch gradient descent style algorithm, to overcome the above limitations. For sub-Gaussian data, we provide nearly optimal statistical error rates even for $n=O(d \log d)$.
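The abstract describes DP-PCA only at a high level, as a "single-pass minibatch gradient descent style algorithm." The sketch below is a minimal illustration of that general pattern (a noisy, minibatch Oja-style update for the top principal component), not the authors' actual DP-PCA algorithm: the noise scale `sigma`, step size `eta`, and batch size are placeholder parameters, and no per-sample clipping or privacy accounting is included, so it does not by itself satisfy a formal $(\varepsilon,\delta)$-DP guarantee.

```python
import numpy as np

def noisy_minibatch_oja(X, batch_size, eta, sigma, seed=0):
    """Single-pass, minibatch Oja-style power iteration with Gaussian noise
    added to each step.  Illustrative only: `sigma` is a placeholder and is
    not calibrated to a formal DP guarantee, and no per-sample clipping is
    performed as a real private algorithm would require."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    for start in range(0, n, batch_size):
        batch = X[start:start + batch_size]         # one pass over the data
        grad = batch.T @ (batch @ v) / len(batch)   # minibatch estimate of Sigma @ v
        noise = sigma * rng.standard_normal(d)      # Gaussian noise for privacy
        v = v + eta * (grad + noise)                # gradient-ascent style update
        v /= np.linalg.norm(v)                      # project back to the unit sphere
    return v

# Usage: recover the top principal component of synthetic spiked Gaussian data.
rng = np.random.default_rng(1)
d = 50
spike = rng.standard_normal(d)
spike /= np.linalg.norm(spike)
X = rng.standard_normal((20_000, d)) + 3.0 * rng.standard_normal((20_000, 1)) * spike
v_hat = noisy_minibatch_oja(X, batch_size=200, eta=0.05, sigma=0.01)
print("alignment with true component:", abs(v_hat @ spike))
```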
