Contributed Video: Incremental Greedy BFGS: An Incremental Quasi-Newton Method with Explicit Superlinear Rate, Zhan Gao

Fri Dec 11 12:00 PM -- 12:30 PM (PST)
Event URL: https://opt-ml.org/papers/2020/paper_65.pdf

Finite-sum minimization problems, i.e., problems whose objective can be written as a sum over a collection of instantaneous costs, are ubiquitous in modern machine learning and data science. Efficient numerical techniques for their solution must trade off per-step complexity against the number of steps required for convergence. Incremental Quasi-Newton methods (IQN) strike a favorable balance between these competing attributes: their per-step complexity is independent of the sample size, while their convergence rate can be faster than linear. To date, however, this local superlinear behavior is known only asymptotically. In this work, we put forth a new variant of IQN, specifically of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) type, that incorporates a greedy basis vector selection step and admits a non-asymptotic, explicit local superlinear rate. To the best of our knowledge, this is the first explicit superlinear rate for Quasi-Newton methods in the incremental setting.
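The greedy ingredient mentioned in the abstract can be illustrated with a minimal sketch. The snippet below is not the paper's incremental algorithm; it is a toy version of the greedy BFGS matrix update in the style of Rodomanov and Nesterov's greedy quasi-Newton scheme, which the paper adapts to the incremental setting. At each step, the coordinate basis vector maximizing the Rayleigh-quotient ratio between the current Hessian approximation B and the target Hessian A is selected, and the BFGS update is applied along that direction. All variable names here are illustrative.

```python
import numpy as np

def greedy_bfgs_update(B, A):
    """One greedy BFGS update of the approximation B toward the
    symmetric positive definite target matrix A.

    Greedy basis selection: pick e_i maximizing
        e_i^T B e_i / e_i^T A e_i   (ratio of diagonal entries),
    then apply the BFGS update along u = e_i, measured against A.
    """
    # Greedy basis vector selection via the diagonal ratio.
    i = int(np.argmax(np.diag(B) / np.diag(A)))
    u = np.zeros(B.shape[0])
    u[i] = 1.0
    Bu, Au = B @ u, A @ u
    # Standard BFGS-type correction along direction u.
    return B - np.outer(Bu, Bu) / (u @ Bu) + np.outer(Au, Au) / (u @ Au)

# Toy illustration: repeated greedy updates drive B toward A.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4.0 * np.eye(4)            # target "Hessian" (SPD)
L = np.linalg.eigvalsh(A).max()
B = L * np.eye(4)                        # initialization with B >= A
errors = []
for _ in range(20):
    B = greedy_bfgs_update(B, A)
    errors.append(np.linalg.norm(B - A))
```

Under the initialization B ⪰ A, the approximation error contracts, which is the mechanism behind the explicit (non-asymptotic) superlinear rates in this line of work; the incremental version analyzed in the paper additionally cycles through the component functions of the finite sum.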

Author Information

Zhan Gao (University of Pennsylvania)

More from the Same Authors

  • 2020 : Contributed talks in Session 3 (Zoom) »
    Mark Schmidt · Zhan Gao · Wenjie Li · Preetum Nakkiran · Denny Wu · Chengrun Yang
  • 2020 : Poster Session 2 (gather.town) »
    Sharan Vaswani · Nicolas Loizou · Wenjie Li · Preetum Nakkiran · Zhan Gao · Sina Baghal · Jingfeng Wu · Roozbeh Yousefzadeh · Jinyi Wang · Jing Wang · Cong Xie · Anastasia Borovykh · Stanislaw Jastrzebski · Soham Dan · Yiliang Zhang · Mark Tuddenham · Sarath Pattathil · Ievgen Redko · Jeremy Cohen · Yasaman Esfandiari · Zhanhong Jiang · Mostafa ElAraby · Chulhee Yun · Michael Psenka · Robert Gower · Xiaoyu Wang