

Poster

Parabolic Approximation Line Search for DNNs

Maximus Mutschler · Andreas Zell

Poster Session 5 #1633

Abstract:

A major challenge in current optimization research for deep learning is to automatically find optimal step sizes for each update step. The optimal step size is closely related to the shape of the loss in the update step direction. However, this shape has not yet been examined in detail. This work shows empirically that the sample loss over lines in the negative gradient direction is mostly convex and well suited for one-dimensional parabolic approximations. Exploiting this parabolic property, we introduce a simple and robust line search approach that performs loss-shape-dependent update steps. Our approach combines well-known methods such as parabolic approximation, line search, and conjugate gradient to perform efficiently. It successfully competes with common and state-of-the-art optimization methods on a large variety of experiments without the need for hand-designed step size schedules. Thus, it is of interest for objectives where step size schedules are unknown or do not perform well. Our extensive evaluation includes multiple comprehensive hyperparameter grid searches on several datasets and architectures. We provide a proof of convergence for an adapted scenario. Finally, we give a general investigation of exact line searches in the context of sample losses and exact losses, including their relation to our line search approach.
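The core idea described above lends itself to a compact sketch: take one extra loss measurement a small distance along the normalized negative gradient, fit a one-dimensional parabola through the current loss, the directional derivative, and that measurement, and step to the parabola's minimum when the fit is convex. The snippet below is a minimal illustration of that idea, not the authors' released implementation; the function and parameter names (`parabolic_line_search_step`, `mu`, `max_step`) and the fallback behaviour for non-convex fits are illustrative assumptions.

```python
import numpy as np

def parabolic_line_search_step(theta, loss_fn, grad_fn, mu=0.1, max_step=1.0):
    """One update step: fit a 1-D parabola to the loss along the
    normalized negative gradient direction and jump to its minimum.
    (Hypothetical helper; names and defaults are illustrative only.)
    """
    g = grad_fn(theta)
    g_norm = np.linalg.norm(g)
    if g_norm == 0.0:
        return theta                      # already at a stationary point
    d = -g / g_norm                       # unit search direction (negative gradient)

    l0 = loss_fn(theta)                   # loss at the current point
    b = -g_norm                           # directional derivative along d
    l1 = loss_fn(theta + mu * d)          # second loss sample a distance mu away

    a = (l1 - l0 - b * mu) / mu**2        # curvature of the fitted parabola
    if a > 0:                             # convex fit: step to the parabola's vertex
        s = -b / (2.0 * a)
    else:                                 # non-convex fit: fall back to the measuring step
        s = mu
    s = min(s, max_step)                  # keep the update step bounded
    return theta + s * d

# Toy usage on a 2-D quadratic loss
A = np.diag([1.0, 10.0])
loss = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

x = np.array([3.0, -2.0])
for _ in range(20):
    x = parabolic_line_search_step(x, loss, grad)
print(x, loss(x))
```

On a quadratic objective the parabolic fit along the line is exact, so each step behaves like an exact line search in the steepest-descent direction; for a DNN the two loss measurements would come from the same mini-batch so that both samples lie on the same one-dimensional slice of the sample loss.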
