Poster
Adaptive First-Order Methods Revisited: Convex Minimization without Lipschitz Requirements
Kimon Antonakopoulos · Panayotis Mertikopoulos

Wed Dec 08 12:30 AM -- 02:00 AM (PST)

We propose a new family of adaptive first-order methods for a class of convex minimization problems that may fail to be Lipschitz continuous or smooth in the standard sense. Specifically, motivated by a recent flurry of activity on non-Lipschitz (NoLips) optimization, we consider problems that are continuous or smooth relative to a reference Bregman function, as opposed to a global, ambient norm (Euclidean or otherwise). These conditions encompass a wide range of problems with singular objectives, such as Fisher markets, Poisson tomography, D-design, and the like. In this setting, existing order-optimal adaptive methods, such as UnixGrad or AcceleGrad, cannot be applied, especially in the presence of randomness and uncertainty. The proposed method, adaptive mirror descent (AdaMir), aims to close this gap by simultaneously achieving min-max optimal rates in problems that are relatively continuous or smooth, including stochastic ones.
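
For illustration, below is a minimal Python sketch of the general template the abstract describes: a mirror descent step taken with respect to a Bregman function (here, negative entropy on the probability simplex, giving exponentiated-gradient updates) combined with an AdaGrad-style step size that adapts to the observed gradients, so no Lipschitz or smoothness constant is needed up front. The step-size rule and the function adamir_sketch are illustrative assumptions, not the exact AdaMir update from the paper.

    import numpy as np

    def adamir_sketch(grad, x0, T=1000, eps=1e-12):
        # Illustrative adaptive mirror descent on the probability simplex.
        # Bregman function: negative entropy, so the mirror step is a
        # multiplicative (exponentiated-gradient) update. The step size is
        # AdaGrad-style, built from accumulated squared gradient norms.
        # NOTE: a hedged sketch, not the paper's exact AdaMir rule.
        x = np.asarray(x0, dtype=float)
        sq_sum = 0.0
        for _ in range(T):
            g = np.asarray(grad(x), dtype=float)
            sq_sum += float(g @ g)
            step = 1.0 / np.sqrt(1.0 + sq_sum)   # adaptive step size
            x = x * np.exp(-step * g)            # entropic mirror step
            x = np.maximum(x, eps)               # keep iterates strictly positive
            x /= x.sum()                         # renormalize onto the simplex
        return x

    # Example: minimize f(x) = 0.5 * ||x - c||^2 over the simplex,
    # whose minimizer is c itself (c already lies on the simplex).
    c = np.array([0.2, 0.5, 0.3])
    x_star = adamir_sketch(lambda x: x - c, x0=np.ones(3) / 3)
    print(x_star)  # converges toward c

The key design point this sketch mirrors is that the step size is determined online from past gradients rather than from a global Lipschitz constant, which is precisely what makes such methods usable when the objective is only relatively continuous or smooth.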

Author Information

Kimon Antonakopoulos (INRIA)
Panayotis Mertikopoulos (CNRS (French National Center for Scientific Research) and Criteo AI Lab)
