

Poster

Periodic Step Size Adaptation for Single Pass On-line Learning

Chun-Nan Hsu · Yu-Ming Chang · Han-Shen Huang · Yuh-Jye Lee


Abstract:

It has been established that the second-order stochastic gradient descent (2SGD) method can potentially achieve generalization performance as good as the empirical optimum in a single pass (i.e., epoch) through the training examples. However, 2SGD requires computing the inverse of the Hessian matrix of the loss function, which is prohibitively expensive. This paper presents Periodic Step-size Adaptation (PSA), which approximates the Jacobian matrix of the mapping function and exploits a linear relation between the Jacobian and the Hessian to approximate the Hessian periodically, achieving near-optimal results in experiments on a wide variety of models and tasks.
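As a rough illustration of the idea, the sketch below shows single-pass SGD with per-coordinate step sizes that are refreshed once per period from a diagonal curvature estimate. This is not the authors' exact PSA algorithm: the function name `psa_like_sgd`, the secant-style curvature estimate, and all parameters (`eta0`, `period`, `eps`) are assumptions made for illustration only.

import numpy as np

def psa_like_sgd(grad_fn, w0, stream, eta0=0.1, period=100, eps=1e-8):
    """Hypothetical sketch: single pass over `stream` with periodically
    adapted per-coordinate step sizes. `grad_fn(w, x)` must return the
    gradient of the loss on example `x` at parameters `w`."""
    w = w0.copy()
    eta = np.full_like(w, eta0)       # per-coordinate step sizes
    w_prev, g_prev = None, None       # snapshot from the previous period
    g_sum = np.zeros_like(w)          # gradients accumulated this period
    for t, x in enumerate(stream, 1):
        g = grad_fn(w, x)
        g_sum += g
        w -= eta * g                  # first-order update with adapted steps
        if t % period == 0:           # periodic step-size adaptation
            g_avg = g_sum / period
            if w_prev is not None:
                # Diagonal secant estimate of curvature over one period:
                # H_ii ~= (g_new_i - g_old_i) / (w_new_i - w_old_i).
                # This stands in for the Jacobian/Hessian relation used by PSA.
                h_diag = np.abs(g_avg - g_prev) / (np.abs(w - w_prev) + eps)
                eta = 1.0 / (h_diag + eps)     # 2SGD-style step ~ inverse curvature
                eta = np.clip(eta, 0.0, eta0)  # keep steps conservative
            w_prev, g_prev = w.copy(), g_avg
            g_sum[:] = 0.0
    return w

Because the curvature estimate is only refreshed every `period` examples, the expensive part of a second-order update is amortized over the period while the per-example update stays as cheap as plain SGD, which is the motivation the abstract describes.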
