Non-Euclidean Differentially Private Stochastic Convex Optimization
Cristóbal Guzmán

Mon Dec 13 07:30 AM -- 07:55 AM (PST)
Abstract: Ensuring the privacy of users' data in machine learning models has become a crucial requirement in multiple domains. In this respect, differential privacy (DP) is the gold standard, due to its general and rigorous privacy guarantees, as well as its high composability. For the particular case of stochastic convex optimization (SCO), recent efforts have established optimal rates for the excess risk under differential privacy in Euclidean setups. These bounds suffer a polynomial degradation of accuracy with respect to the dimension, which limits their applicability in high-dimensional settings. In this talk, I will present nearly dimension-independent rates on the excess risk for DP-SCO in the $\ell_1$ setup, as well as an investigation of more general $\ell_p$ setups, where $1\leq p\leq \infty$. Based on joint work with Raef Bassily and Anupama Nandi.
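For readers unfamiliar with the setting, DP-SCO is commonly approached via noisy first-order methods. The sketch below is not the algorithm from this talk; it is a minimal illustrative noisy projected SGD on a toy quadratic loss, where the Gaussian perturbation stands in for DP noise. All constants (noise scale, step size, radius, dimension) are hypothetical choices, not calibrated to any formal $(\varepsilon,\delta)$ guarantee.

```python
import numpy as np

# Hedged sketch (not from the talk): noisy projected SGD for stochastic
# convex optimization. The added Gaussian noise mimics gradient
# privatization; constants here are illustrative assumptions only.

rng = np.random.default_rng(0)
d = 5                    # dimension (toy choice)
w_star = np.ones(d)      # minimizer of the population loss below
radius = 10.0            # l2-ball constraint
sigma = 0.5              # noise scale standing in for DP noise
eta = 0.1                # step size
steps = 2000

def grad(w, x):
    # gradient of the per-sample loss 0.5 * ||w - x||^2
    return w - x

def project(w, r):
    # Euclidean projection onto the l2 ball of radius r
    n = np.linalg.norm(w)
    return w if n <= r else w * (r / n)

w = np.zeros(d)
avg = np.zeros(d)
for t in range(steps):
    x = w_star + rng.normal(0.0, 0.1, d)       # stochastic sample
    g = grad(w, x) + rng.normal(0.0, sigma, d)  # noisy gradient step
    w = project(w - eta * g, radius)
    avg += w
avg /= steps

print(np.linalg.norm(avg - w_star))
```

Averaging the iterates smooths out the injected noise, so the averaged point lands close to the population minimizer despite the per-step perturbation; the dimension dependence of such bounds in non-Euclidean ($\ell_p$) geometries is the subject of the talk.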

Author Information

Cristóbal Guzmán (U of Twente)
