Poster
Differentially Private Stochastic Optimization: New Results in Convex and Non-Convex Settings
Raef Bassily · Cristóbal Guzmán · Michael Menart
Keywords: [ Privacy ] [ Optimization ]
Abstract:
We study differentially private stochastic optimization in convex and non-convex settings. For the convex case, we focus on the family of non-smooth generalized linear losses (GLLs). Our algorithm for the ℓ2 setting achieves optimal excess population risk in near-linear time, while the best known differentially private algorithms for general convex losses run in super-linear time. Our algorithm for the ℓ1 setting has nearly-optimal excess population risk, and circumvents the dimension-dependent lower bound of Asi et al. (2021) for general non-smooth convex losses. In the differentially private non-convex setting, we provide several new algorithms for approximating stationary points of the population risk. For the ℓ1 case with smooth losses and a polyhedral constraint, we provide the first nearly dimension-independent rate, in linear time. For the constrained ℓ2 case with smooth losses, we obtain a linear-time algorithm. Finally, for the ℓ2 case we provide the first method for non-smooth weakly convex stochastic optimization, whose rate matches that of the best existing non-private algorithm when the dimension is sufficiently small relative to the sample size. We also extend all of our results above for the non-convex setting to the ℓp setting, where 1 < p ≤ 2, with only polylogarithmic (in the dimension) overhead in the rates.
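The abstract states rates rather than algorithms. As a loose illustration of the standard template behind differentially private stochastic optimization — clip each per-example gradient, then add Gaussian noise before the update — here is a minimal DP-SGD sketch on a linear least-squares (generalized linear) loss. This is a generic toy under stated assumptions, not the paper's method; the function name `dp_sgd` and all parameter choices are illustrative.

```python
import numpy as np

def dp_sgd(X, y, steps=200, lr=0.1, batch=32, clip=1.0, noise_mult=0.5, seed=0):
    """Toy DP-SGD on the linear least-squares loss (x.w - y)^2 / 2.

    Each per-example gradient is clipped to l2-norm `clip`, and Gaussian
    noise with std `noise_mult * clip` is added to the summed gradient
    before averaging -- the usual Gaussian-mechanism recipe. Calibrating
    `noise_mult` to a concrete (eps, delta) budget is omitted here.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        idx = rng.choice(n, size=batch, replace=False)
        resid = X[idx] @ w - y[idx]              # per-example residuals
        grads = resid[:, None] * X[idx]          # per-example gradients, (batch, d)
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads / np.maximum(1.0, norms / clip)   # clip each gradient
        noisy_sum = grads.sum(axis=0) + rng.normal(0.0, noise_mult * clip, size=d)
        w -= lr * noisy_sum / batch              # noisy averaged-gradient step
    return w
```

On synthetic data `y = X @ w_true + noise`, the returned `w` should roughly recover `w_true` despite the clipping and injected noise, at an accuracy that degrades as `noise_mult` grows.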