Poster
Proximal Newton-type Methods for Minimizing Convex Objective Functions in Composite Form
Jason D Lee · Yuekai Sun · Michael Saunders
Harrah’s Special Events Center 2nd Floor
Abstract:
We consider minimizing convex objective functions in \emph{composite form},
\[
\underset{x \in \mathbb{R}^n}{\text{minimize}} \quad f(x) := g(x) + h(x),
\]
where $g$ is convex and twice continuously differentiable, and $h : \mathbb{R}^n \to \mathbb{R}$ is convex but not necessarily differentiable, with a proximal mapping that can be evaluated efficiently. We derive a generalization of Newton-type methods that handles such nonsmooth convex objectives. Many problems of relevance in high-dimensional statistics, machine learning, and signal processing can be formulated in composite form. We prove that such methods converge globally to a minimizer and achieve quadratic rates of convergence in the vicinity of a unique minimizer. We also demonstrate the performance of these methods on problems from machine learning and high-dimensional statistics.
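For context, a proximal Newton-type step replaces the gradient step of proximal gradient methods with a proximal mapping scaled by a Newton-type matrix. The notation below is a standard formulation of this idea, not quoted from the paper itself:
\[
\operatorname{prox}_h^{H}(y) := \arg\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\,\|x - y\|_H^2 + h(x),
\qquad \|z\|_H^2 := z^\top H z,
\]
\[
x_{k+1} = \operatorname{prox}_h^{H_k}\!\bigl(x_k - H_k^{-1} \nabla g(x_k)\bigr),
\]
where $H_k$ is the Hessian $\nabla^2 g(x_k)$ or a positive-definite approximation to it. Taking $H_k = \tfrac{1}{t_k} I$ recovers the standard proximal gradient step with step length $t_k$.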
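The sketch below illustrates one plausible realization of such a method in Python for $\ell_1$-regularized logistic regression. The function names, the fixed unit step length (the paper analyzes step-size selection), and the choice of an inner proximal-gradient solver for the local model are all illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal mapping of t * ||.||_1 (soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_newton_l1_logreg(A, y, lam, outer=15, inner=200):
    # Illustrative proximal Newton sketch for
    #     minimize  g(x) + lam * ||x||_1,
    # where g is the logistic loss over rows of A with labels y in {-1, +1}.
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(outer):
        margins = y * (A @ x)
        p = 0.5 * (1.0 - np.tanh(0.5 * margins))   # sigmoid(-margins), numerically stable
        grad = -A.T @ (y * p)                      # gradient of g at x
        H = A.T @ ((p * (1.0 - p))[:, None] * A)   # Hessian of g at x
        L = np.linalg.norm(H, 2) + 1e-12           # spectral norm bounds the model's curvature
        # Inner loop: approximately minimize the local model
        #     grad^T (z - x) + 0.5 (z - x)^T H (z - x) + lam * ||z||_1
        # by proximal gradient iterations on z.
        z = x.copy()
        for _ in range(inner):
            model_grad = grad + H @ (z - x)
            z = soft_threshold(z - model_grad / L, lam / L)
        x = z   # unit step; a backtracking line search would go here
    return x

# Example usage on synthetic data:
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
y = np.sign(A @ rng.standard_normal(50) + 0.1 * rng.standard_normal(200))
x_hat = prox_newton_l1_logreg(A, y, lam=1.0)
```

Solving the penalized local model with a first-order inner method is a natural design choice here because the subproblem only requires the proximal mapping of $h$, which the abstract assumes is cheap to evaluate.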