

Poster in Workshop: OPT 2023: Optimization for Machine Learning

Follow the flow: Proximal flow inspired multi-step methods

Yushen Huang · Yifan Sun


Abstract:

We investigate a family of Multi-Step Proximal Point Methods, the Backwards Differentiation Formulas, which are inspired by implicit linear discretizations of gradient flow. The resulting methods are multi-step proximal point methods with a per-update computational cost similar to that of the proximal point method. We explore several optimization methods where applying an approximate multi-step proximal point method results in improved convergence behavior. We argue that this is the result of lowering the truncation error in approximating gradient flow.
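
As a rough illustration of the general idea rather than the authors' exact method, the sketch below shows how a BDF2 (two-step backwards differentiation) implicit discretization of gradient flow x'(t) = -grad f(x(t)) rearranges into a two-step proximal point update. The toy quadratic problem, the function names, and the choice of step size are all illustrative assumptions.

```python
import numpy as np

def bdf2_proximal_step(prox_f, x_prev, x_curr, h):
    # BDF2 implicit discretization of gradient flow x'(t) = -grad f(x(t)):
    #   (3/2) x_{k+1} - 2 x_k + (1/2) x_{k-1} = -h * grad f(x_{k+1}),
    # which rearranges into a multi-step proximal point update:
    #   x_{k+1} = prox_{(2h/3) f}( (4/3) x_k - (1/3) x_{k-1} ).
    v = (4.0 / 3.0) * x_curr - (1.0 / 3.0) * x_prev
    return prox_f(v, 2.0 * h / 3.0)

# Toy problem (an assumption for illustration): f(x) = 0.5 * ||A x - b||^2,
# whose prox has the closed form prox_{lam f}(v) = (I + lam A^T A)^{-1}(v + lam A^T b).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def prox_f(v, lam):
    return np.linalg.solve(np.eye(5) + lam * A.T @ A, v + lam * A.T @ b)

h = 0.5
x_prev = np.zeros(5)
x_curr = prox_f(x_prev, h)  # bootstrap the two-step recursion with one backward-Euler (proximal point) step
for _ in range(100):
    x_prev, x_curr = x_curr, bdf2_proximal_step(prox_f, x_prev, x_curr, h)

print(0.5 * np.linalg.norm(A @ x_curr - b) ** 2)  # objective value after 100 BDF2-style proximal steps
```

Each iteration costs roughly the same as a single proximal point step, since only one prox evaluation is needed; the higher-order BDF discretization is what is claimed to reduce the truncation error relative to gradient flow.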
