Identification, Amplification and Measurement: A bridge to Gaussian Differential Privacy

Yi Liu · Ke Sun · Bei Jiang · Linglong Kong

Hall J #922

Keywords: privacy profile, differential privacy, Gaussian differential privacy

Thu 1 Dec 2 p.m. PST — 4 p.m. PST
Spotlight presentation: Lightning Talks 2B-4
Tue 6 Dec 6:30 p.m. PST — 6:45 p.m. PST

Abstract: Gaussian differential privacy (GDP) is a single-parameter family of privacy notions that provides coherent guarantees against the exposure of sensitive individual information. Despite the extra interpretability and tighter composition bounds that GDP provides, many widely used mechanisms (e.g., the Laplace mechanism) inherently satisfy GDP guarantees but often fail to take advantage of this new framework because their privacy guarantees were derived under a different framework. In this paper, we study the asymptotic properties of privacy profiles and develop a simple criterion to identify algorithms with GDP properties. We propose an efficient method for GDP algorithms to narrow down possible values of the optimal privacy measurement $\mu$ with an arbitrarily small and quantifiable margin of error. For non-GDP algorithms, we provide a post-processing procedure that can amplify existing privacy guarantees to meet the GDP condition. As applications, we compare two single-parameter families of privacy notions, $\epsilon$-DP and $\mu$-GDP, and show that all $\epsilon$-DP algorithms are intrinsically also GDP. Lastly, we show that the combination of our measurement process and the composition theorem of GDP is a powerful and convenient tool for handling compositions compared to the traditional standard and advanced composition theorems.
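The abstract's comparison of $\epsilon$-DP and $\mu$-GDP rests on the standard dual characterization of GDP as a privacy profile: a mechanism is $\mu$-GDP if and only if it is $(\epsilon, \delta(\epsilon))$-DP for all $\epsilon \ge 0$, where $\delta(\epsilon) = \Phi(-\epsilon/\mu + \mu/2) - e^{\epsilon}\,\Phi(-\epsilon/\mu - \mu/2)$ and $\Phi$ is the standard normal CDF. The sketch below is not code from the paper; it is a minimal illustration of that known conversion, with all function names our own.

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gdp_privacy_profile(eps: float, mu: float) -> float:
    """delta(eps) for a mu-GDP mechanism (Dong, Roth & Su duality).

    A mu-GDP mechanism satisfies (eps, delta(eps))-DP for every eps >= 0,
    with delta(eps) = Phi(-eps/mu + mu/2) - exp(eps) * Phi(-eps/mu - mu/2).
    """
    return (normal_cdf(-eps / mu + mu / 2.0)
            - math.exp(eps) * normal_cdf(-eps / mu - mu / 2.0))

# Example: the profile of a 1-GDP mechanism decays as eps grows.
for eps in (0.0, 0.5, 1.0, 2.0):
    print(f"eps={eps:.1f}  delta={gdp_privacy_profile(eps, mu=1.0):.6f}")
```

For instance, at $\epsilon = 0$ the formula reduces to $\Phi(\mu/2) - \Phi(-\mu/2)$ (about 0.383 for $\mu = 1$), and $\delta(\epsilon)$ decreases monotonically toward 0 as $\epsilon$ grows, which is the asymptotic behavior of privacy profiles that the paper's identification criterion examines.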
