Poster
Differentially Private Covariance Estimation
Kareem Amin · Travis Dick · Alex Kulesza · Andres Munoz Medina · Sergei Vassilvitskii
East Exhibition Hall B, C #157
Keywords: [ Applications ] [ Privacy, Anonymity, and Security ] [ Algorithms ] [ Components Analysis (e.g., CCA, ICA, LDA, PCA) ]
The covariance matrix of a dataset is a fundamental statistic that can be used to compute optimal regression weights, as well as in many other learning and data analysis settings. For datasets containing private user information, we often want to estimate the covariance matrix in a way that preserves differential privacy. While there are known methods for privately computing the covariance matrix, they all have one of two major shortcomings. Some, like the Gaussian mechanism, only guarantee (epsilon, delta)-differential privacy, leaving a non-trivial probability of privacy failure. Others give strong epsilon-differential privacy guarantees but are impractical, requiring complicated sampling schemes, and tend to perform poorly on real data.
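To make the first shortcoming concrete, the standard Gaussian mechanism for covariance estimation can be sketched as follows. This is a generic illustration, not the paper's algorithm: it assumes each data row has L2 norm at most 1, so replacing one row changes X^T X by at most 2 in Frobenius norm, and it calibrates noise using the usual (epsilon, delta) bound sigma = (sensitivity) * sqrt(2 ln(1.25/delta)) / epsilon.

```python
import numpy as np

def gaussian_mechanism_covariance(X, epsilon, delta, rng=None):
    """(epsilon, delta)-DP estimate of the covariance X^T X / n.

    Illustrative sketch only. Assumes every row of X has L2 norm <= 1,
    so the Frobenius sensitivity of X^T X / n under replacing one row
    is at most 2 / n.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    sensitivity = 2.0 / n
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    cov = X.T @ X / n
    # Draw i.i.d. Gaussian noise for the upper triangle and mirror it,
    # so the private output stays symmetric like a covariance matrix.
    noise = rng.normal(scale=sigma, size=(d, d))
    noise = np.triu(noise) + np.triu(noise, 1).T
    return cov + noise
```

Note that the privacy guarantee holds only for delta > 0; shrinking delta toward zero forces sigma to grow without bound, which is exactly why pure epsilon-DP alternatives are of interest.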
In this work we propose a new epsilon-differentially private algorithm for computing the covariance matrix of a dataset that addresses both of these limitations. We show that it has lower error than existing state-of-the-art approaches, both analytically and empirically. In addition, the algorithm is significantly less complicated than other methods and can be efficiently implemented with rejection sampling.
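The abstract notes that the proposed algorithm can be implemented efficiently with rejection sampling. For readers unfamiliar with that primitive, here is a minimal generic sketch; it is not the paper's specific sampler, just the textbook technique of drawing from an unnormalized density f on [0, 1] by accepting uniform proposals with probability f(x) / M, where M upper-bounds f.

```python
import numpy as np

def rejection_sample(f, M, rng, max_tries=100000):
    """Draw one sample from the density proportional to f on [0, 1].

    Generic rejection sampling with a uniform proposal: accept a
    uniform draw x with probability f(x) / M, where f(x) <= M on [0, 1].
    Illustrative only; not the algorithm from the paper.
    """
    for _ in range(max_tries):
        x = rng.uniform()
        if rng.uniform() * M <= f(x):
            return x
    raise RuntimeError("acceptance rate too low; increase max_tries or tighten M")

# Example target: an exponential-mechanism-style density proportional
# to exp(eps * x) on [0, 1], bounded above by exp(eps).
eps = 2.0
samples = [rejection_sample(lambda x: np.exp(eps * x), np.exp(eps),
                            np.random.default_rng(i)) for i in range(2000)]
```

The appeal of rejection sampling in this setting is practical: it needs only pointwise evaluation of the unnormalized density and a bound M, avoiding the complicated bespoke sampling schemes the abstract attributes to prior epsilon-DP methods.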