
A gradient sampling method with complexity guarantees for Lipschitz functions in high and low dimensions
Damek Davis · Dmitriy Drusvyatskiy · Yin Tat Lee · Swati Padmanabhan · Guanghao Ye

Thu Dec 01 09:00 AM -- 11:00 AM (PST) @ Hall J #837
Zhang et al. (ICML 2020) introduced a novel modification of Goldstein's classical subgradient method, with an efficiency guarantee of $O(\varepsilon^{-4})$ for minimizing Lipschitz functions. Their work, however, makes use of an oracle that is not efficiently implementable. In this paper, we obtain the same efficiency guarantee with a standard subgradient oracle, thus making our algorithm efficiently implementable. Our resulting method works on any Lipschitz function whose value and gradient can be evaluated at points of differentiability. We additionally present a new cutting plane algorithm that achieves an efficiency of $O(d\varepsilon^{-2}\log S)$ for the class of $S$-smooth (and possibly non-convex) functions in low dimensions. Strikingly, this $\varepsilon$-dependence matches the lower bounds for the convex setting.
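To give a flavor of the gradient-sampling idea behind methods in this line of work, here is a minimal sketch (not the paper's exact algorithm): at each iterate, gradients are sampled at random nearby points where the function is differentiable, combined, and the negative combined direction is followed. The test function, ball radius, sample count, and step size below are all illustrative choices, and the sketch averages the sampled gradients rather than computing the minimal-norm element of their convex hull as Goldstein-style methods prescribe.

```python
import numpy as np

def f(x):
    # A simple nonsmooth Lipschitz test function (illustrative choice).
    return abs(x[0]) + 0.5 * abs(x[1])

def grad(x):
    # Gradient wherever f is differentiable (almost everywhere for this f).
    return np.array([np.sign(x[0]), 0.5 * np.sign(x[1])])

def gradient_sampling_step(x, rng, delta=0.1, num_samples=8, step=0.05):
    # Sample points uniformly from a delta-ball around x.
    dirs = rng.standard_normal((num_samples, x.size))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    radii = delta * rng.random(num_samples) ** (1.0 / x.size)
    pts = x + radii[:, None] * dirs
    # Average the sampled gradients; the true Goldstein step would use
    # the minimal-norm element of their convex hull instead.
    g = np.mean([grad(p) for p in pts], axis=0)
    norm = np.linalg.norm(g)
    if norm < 1e-12:
        return x  # approximately (delta, eps)-stationary: stop moving
    return x - step * g / norm

rng = np.random.default_rng(0)
x = np.array([1.0, 1.0])
for _ in range(30):
    x = gradient_sampling_step(x, rng)
```

Run from the starting point $(1,1)$, where $f(x) = 1.5$, the iterates drift toward the minimizer at the origin despite $f$ being nondifferentiable along both axes.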

Author Information

Damek Davis (Cornell University)

Damek Davis is an Associate Professor of Operations Research at Cornell University. His research focuses on the interplay of optimization, signal processing, statistics, and machine learning. He has received several awards for his work, including a Sloan Research Fellowship in Mathematics (2020), the INFORMS Optimization Society Young Researchers Prize (2019), and an NSF CAREER Award (2021).

Dmitriy Drusvyatskiy (University of Washington)
Yin Tat Lee (University of Washington)
Swati Padmanabhan (University of Washington, Seattle)
Guanghao Ye (Massachusetts Institute of Technology)

I am a second-year PhD student at MIT Math.
