
Poster

KNG: The K-Norm Gradient Mechanism

Matthew Reimherr · Jordan Awan

East Exhibition Hall B, C #92

Keywords: [ Algorithms ] [ Regression ] [ Applications ] [ Privacy, Anonymity, and Security ]


Abstract:

This paper presents a new mechanism for producing sanitized statistical summaries that achieve differential privacy, called the K-Norm Gradient Mechanism, or KNG. This new approach maintains the strong flexibility of the exponential mechanism, while achieving the powerful utility performance of objective perturbation. KNG starts with an inherent objective function (often an empirical risk), and promotes summaries that are close to minimizing the objective by weighting according to how far the gradient of the objective function is from zero. Working with the gradient instead of the original objective function allows for additional flexibility, as one can penalize using different norms. We show that, unlike the exponential mechanism, the noise added by KNG is asymptotically negligible compared to the statistical error for many problems. In addition to theoretical guarantees on privacy and utility, we confirm the utility of KNG empirically in the settings of linear and quantile regression through simulations.
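To make the gradient-weighting idea concrete, here is a minimal sketch of a KNG-style sampler for 1-D mean estimation; it is not the paper's full construction. The objective is the empirical squared-error risk f(theta) = (1/2n) * sum_i (x_i - theta)^2, whose gradient is theta - mean(x), and the sampler draws theta with density proportional to exp(-epsilon * |grad f(theta)| / (2 * Delta)). The data clipping to [0, 1], the resulting sensitivity bound Delta = 1/n, and the grid range and resolution are all illustrative assumptions made for this demo, not values taken from the paper.

```python
import numpy as np

def kng_mean(x, epsilon, grid=None):
    """Grid-based sampler for a KNG-style density in 1-D (illustrative only)."""
    n = len(x)
    x = np.clip(x, 0.0, 1.0)          # enforce bounded data (assumption for the demo)
    delta = 1.0 / n                   # gradient sensitivity under clipping (assumed bound)
    if grid is None:
        grid = np.linspace(0.0, 1.0, 2001)  # arbitrary candidate grid for the output
    grad = grid - x.mean()            # gradient of the empirical risk at each theta
    log_w = -epsilon * np.abs(grad) / (2.0 * delta)
    w = np.exp(log_w - log_w.max())   # subtract max for numerical stability
    w /= w.sum()                      # normalize to a probability distribution
    return np.random.choice(grid, p=w)

rng = np.random.default_rng(0)
data = rng.uniform(size=500)
print("non-private mean:", data.mean())
print("KNG-style release:", kng_mean(data, epsilon=1.0))
```

In this 1-D case the weighting exp(-epsilon * |theta - mean(x)| / (2 * Delta)) is a Laplace-shaped density centered at the non-private mean, which illustrates the abstract's point: the noise scale is governed by the gradient's sensitivity, which shrinks with n, so the privacy noise becomes negligible relative to the statistical error.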