

Poster

Differentially Private Robust Low-Rank Approximation

Raman Arora · Vladimir Braverman · Jalaj Upadhyay

Room 517 AB #145

Keywords: [ Privacy, Anonymity, and Security ]


Abstract: In this paper, we study the following robust low-rank matrix approximation problem: given a matrix $A \in \mathbb{R}^{n \times d}$, find a rank-$k$ matrix $B$, while satisfying differential privacy, such that $\| A - B \|_p \leq \alpha\, \mathsf{OPT}_k(A) + \tau$, where $\| M \|_p$ is the entrywise $\ell_p$-norm and $\mathsf{OPT}_k(A) := \min_{\mathsf{rank}(X) \leq k} \| A - X \|_p$. It is well known that low-rank approximation w.r.t. the entrywise $\ell_p$-norm, for $p \in [1,2)$, yields robustness to gross outliers in the data. We propose an algorithm that guarantees $\alpha = \widetilde{O}(k^2)$, $\tau = \widetilde{O}(k^2(n+kd)/\varepsilon)$, runs in $\widetilde{O}((n+d)\,\mathrm{poly}(k))$ time, and uses $O(k(n+d)\log k)$ space. We study extensions to the streaming setting where entries of the matrix arrive in an arbitrary order and output is produced at the very end or continually. We also study the related problem of differentially private robust principal component analysis (PCA), wherein we return a rank-$k$ projection matrix $\Pi$ such that $\| A - A \Pi \|_p \leq \alpha\, \mathsf{OPT}_k(A) + \tau$.
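To make the objective concrete, the following is a minimal, non-private Python sketch that only illustrates the error measure from the abstract: it computes the entrywise $\ell_p$ error of a rank-$k$ approximation. The SVD baseline used here is neither differentially private nor robust (it is optimal for the Frobenius norm, not $\ell_p$ with $p \in [1,2)$) and is not the paper's algorithm; all function names are illustrative.

```python
import numpy as np

def entrywise_lp_error(A, B, p):
    """Entrywise l_p norm of A - B, i.e. (sum_ij |A_ij - B_ij|^p)^(1/p)."""
    return np.sum(np.abs(A - B) ** p) ** (1.0 / p)

def rank_k_svd(A, k):
    """Best rank-k approximation under the Frobenius norm (a non-private,
    non-robust baseline; the paper targets the entrywise l_p objective)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 50))
    A[0, 0] = 1e3  # a gross outlier; the l_1 objective penalizes it far less than l_2
    B = rank_k_svd(A, k=5)
    print("entrywise l_1 error:", entrywise_lp_error(A, B, p=1))
    print("entrywise l_2 error:", entrywise_lp_error(A, B, p=2))
```

The outlier entry dominates the $\ell_2$ error far more than the $\ell_1$ error, which is the intuition behind using $p \in [1,2)$ for robustness.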
