Poster
Adapting to function difficulty and growth conditions in private optimization
Hilal Asi · Daniel Levy · John Duchi
Keywords: [ Privacy ] [ Optimization ]
Abstract:
We develop algorithms for private stochastic convex optimization that adapt to the hardness of the specific function we wish to optimize. While previous work provides worst-case bounds for arbitrary convex functions, it is often the case that the function at hand belongs to a smaller class that enjoys faster rates. Concretely, we show that for functions exhibiting κ-growth around the optimum, i.e., f(x) ≥ f(x⋆) + (λ/κ)‖x − x⋆‖₂^κ for κ > 1, our algorithms improve upon the standard √d/(nε) privacy rate to the faster (√d/(nε))^{κ/(κ−1)}. Crucially, they achieve these rates without knowledge of the growth exponent κ of the function. Our algorithms build upon the inverse sensitivity mechanism, which adapts to instance difficulty [2], and recent localization techniques in private optimization [25]. We complement our algorithms with matching lower bounds for these function classes and demonstrate that our adaptive algorithm is simultaneously (minimax) optimal over all κ ≥ 1 + c whenever c = Θ(1).
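For reference, a display-form restatement in LaTeX of the growth condition and the rate improvement above (a sketch; here d denotes the dimension, n the sample size, and ε the privacy parameter, following standard usage in this literature, since the abstract does not define them explicitly):

% κ-growth around the optimum x⋆: λ > 0 is the growth constant,
% κ > 1 the growth exponent.
\[
f(x) \;\ge\; f(x^\star) + \frac{\lambda}{\kappa}\,\lVert x - x^\star \rVert_2^{\kappa},
\qquad \kappa > 1.
\]
% The adaptive algorithms improve the standard private rate to the faster
% κ-dependent rate; the two coincide in the limit κ → ∞, where κ/(κ−1) → 1.
\[
\frac{\sqrt{d}}{n\varepsilon}
\quad\longrightarrow\quad
\left(\frac{\sqrt{d}}{n\varepsilon}\right)^{\kappa/(\kappa-1)}.
\]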