Poster
Differentially Private Stochastic Optimization: New Results in Convex and Non-Convex Settings
Raef Bassily · Cristóbal Guzmán · Michael Menart
We study differentially private stochastic optimization in convex and non-convex settings. For the convex case, we focus on the family of non-smooth generalized linear losses (GLLs). Our algorithm for the $\ell_2$ setting achieves optimal excess population risk in near-linear time, while the best known differentially private algorithms for general convex losses run in super-linear time. Our algorithm for the $\ell_1$ setting has nearly-optimal excess population risk $\tilde{O}\big(\sqrt{\frac{\log{d}}{n}}\big)$, and circumvents the dimension-dependent lower bound of \cite{Asi:2021} for general non-smooth convex losses. In the differentially private non-convex setting, we provide several new algorithms for approximating stationary points of the population risk. For the $\ell_1$ case with smooth losses and polyhedral constraint, we provide the first nearly dimension-independent rate, $\tilde O\big(\frac{\log^{2/3}{d}}{{n^{1/3}}}\big)$, in linear time. For the constrained $\ell_2$ case with smooth losses, we obtain a linear-time algorithm with rate $\tilde O\big(\frac{1}{n^{3/10}d^{1/10}}+\big(\frac{d}{n^2}\big)^{1/5}\big)$. Finally, for the $\ell_2$ case, we provide the first method for {\em non-smooth weakly convex} stochastic optimization with rate $\tilde O\big(\frac{1}{n^{1/4}}+\big(\frac{d}{n^2}\big)^{1/6}\big)$, which matches the best existing non-private algorithm when $d = O(\sqrt{n})$. We also extend all our results above for the non-convex $\ell_2$ setting to the $\ell_p$ setting, where $1 < p \leq 2$, with only polylogarithmic (in the dimension) overhead in the rates.
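To make the setting concrete, the following is a minimal noisy-SGD sketch of differentially private stochastic optimization on a linear predictor: per-sample gradients are clipped and perturbed with Gaussian noise before each step. This is an illustrative sketch only, not the paper's algorithm; the function name `dp_sgd_gll`, the squared loss, the simplified noise calibration, and all parameter choices are assumptions for illustration.

```python
import numpy as np

def dp_sgd_gll(X, y, epsilon, delta, lr=0.1, clip=1.0, seed=0):
    """One-pass noisy SGD on a linear predictor with squared loss.

    Illustrative sketch: per-sample gradient clipping plus Gaussian noise,
    with a deliberately simplified noise calibration (a real analysis would
    use tighter composition/amplification accounting).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Gaussian noise scale for n adaptive steps (simplified accounting).
    sigma = clip * np.sqrt(2.0 * n * np.log(1.25 / delta)) / epsilon
    w = np.zeros(d)
    iterates = []
    for i in rng.permutation(n):
        # Gradient of 0.5 * (x_i . w - y_i)^2 with respect to w.
        g = (X[i] @ w - y[i]) * X[i]
        # Clip the per-sample gradient to bound its sensitivity.
        g *= min(1.0, clip / (np.linalg.norm(g) + 1e-12))
        # Privatize the step with isotropic Gaussian noise.
        g += rng.normal(0.0, sigma, size=d)
        w -= (lr / np.sqrt(n)) * g
        iterates.append(w.copy())
    # Return the averaged iterate, as is standard for SGD-style analyses.
    return np.mean(np.stack(iterates), axis=0)
```

For example, `dp_sgd_gll(X, y, epsilon=1.0, delta=1e-5)` returns a privatized parameter vector of shape `(d,)`; smaller `epsilon` yields larger `sigma` and hence noisier output.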
Author Information
Raef Bassily (The Ohio State University)
Cristóbal Guzmán (University of Twente)
Michael Menart (Ohio State University)
More from the Same Authors

2021 : Q&A with Cristóbal Guzmán »
Cristóbal Guzmán 
2021 : Non-Euclidean Differentially Private Stochastic Convex Optimization, Cristóbal Guzmán »
Cristóbal Guzmán 
2021 Poster: Best-case lower bounds in online learning »
Cristóbal Guzmán · Nishant Mehta · Ali Mortazavi 
2020 Poster: Stability of Stochastic Gradient Descent on Non-smooth Convex Losses »
Raef Bassily · Vitaly Feldman · Cristóbal Guzmán · Kunal Talwar 
2020 Spotlight: Stability of Stochastic Gradient Descent on Non-smooth Convex Losses »
Raef Bassily · Vitaly Feldman · Cristóbal Guzmán · Kunal Talwar 
2020 Poster: Learning from Mixtures of Private and Public Populations »
Raef Bassily · Shay Moran · Anupama Nandi