Poster
in
Workshop: Order up! The Benefits of Higher-Order Optimization in Machine Learning

On the Global Convergence of the Regularized Generalized Gauss-Newton Algorithm

Vincent Roulet · Maryam Fazel · Siddhartha Srinivasa · Zaid Harchaoui


Abstract:

We detail the global convergence rates of a regularized generalized Gauss-Newton algorithm applied to compositional problems with surjective inner Jacobian mappings. Our analysis uncovers several phases of convergence and identifies the key condition numbers that govern the algorithm's complexity. We present an implementation with a line search adaptive to the problem's constants.
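To make the setting concrete, here is a minimal sketch of one regularized generalized Gauss-Newton iteration for a compositional objective, specialized to the squared outer loss h(z) = ½‖z − y‖². The function names (`g`, `J`, `reg_ggn_step`), the toy inner mapping, the target `y`, the fixed regularization `lam`, and the iteration count are all illustrative assumptions, not the authors' implementation (which additionally uses an adaptive line search).

```python
import numpy as np

def reg_ggn_step(g, J, x, y, lam):
    # For the outer loss h(z) = 0.5*||z - y||^2, the regularized
    # generalized Gauss-Newton step linearizes the inner map g and solves
    #   min_v 0.5*||g(x) + J(x) v - y||^2 + (lam/2)*||v||^2,
    # i.e. the linear system (J^T J + lam*I) v = -J^T (g(x) - y).
    Jx = J(x)
    r = g(x) - y
    v = np.linalg.solve(Jx.T @ Jx + lam * np.eye(x.size), -Jx.T @ r)
    return x + v

# Hypothetical toy problem: inner mapping g(x) = (x1^2, x1*x2) with a
# surjective Jacobian away from x1 = 0, and an arbitrary target y.
g = lambda x: np.array([x[0] ** 2, x[0] * x[1]])
J = lambda x: np.array([[2 * x[0], 0.0], [x[1], x[0]]])
y = np.array([1.0, 2.0])

x = np.array([2.0, 1.0])
for _ in range(50):
    x = reg_ggn_step(g, J, x, y, lam=1e-3)
```

Here a small fixed `lam` stands in for the paper's regularization scheme; the iterates drive g(x) toward y because the inner Jacobian stays surjective along the trajectory.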
