Second-order methods for nonconvex optimization with complexity guarantees
Stephen Wright

Fri Dec 13 05:00 PM -- 05:45 PM (PST)

We consider problems of smooth nonconvex optimization: unconstrained, bound-constrained, and with general equality constraints. We show that algorithms for these problems that are widely used in practice can be modified slightly in ways that guarantee convergence to approximate first- and second-order optimal points, with complexity guarantees that depend on the desired accuracy. The methods we discuss are constructed from Newton's method, the conjugate gradient method, the log-barrier method, and augmented Lagrangians. (In some cases, special structure of the objective function makes for only a weak dependence on the accuracy parameter.) Our methods require Hessian information only in the form of Hessian-vector products, so they do not require the Hessian to be evaluated and stored explicitly. This talk describes joint work with Clement Royer, Yue Xie, and Michael O'Neill.
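The abstract notes that the methods need Hessian information only through Hessian-vector products, so the Hessian matrix itself is never formed or stored. A minimal sketch of why this is possible (an illustration assumed here, not the talk's actual algorithm): the product H(x)v can be approximated from two gradient evaluations via a finite difference, H(x)v ≈ (∇f(x + εv) − ∇f(x)) / ε.

```python
import numpy as np

def hessian_vector_product(grad, x, v, eps=1e-6):
    """Approximate H(x) @ v using only gradient calls (forward difference).

    grad: callable returning the gradient of f at a point.
    This avoids forming the n-by-n Hessian, which matters when n is large.
    """
    return (grad(x + eps * v) - grad(x)) / eps

# Illustrative example: f(x) = 0.5 * x^T A x with symmetric A,
# so grad(x) = A x and the Hessian is exactly A.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
grad = lambda x: A @ x

x = np.array([1.0, -2.0])
v = np.array([0.5, 1.0])
hv = hessian_vector_product(grad, x, v)
# For this quadratic the approximation matches A @ v = [3.0, 3.5]
# up to floating-point error.
```

Methods such as Newton-CG, mentioned implicitly through the conjugate gradient component, need only repeated products of this form, which is what makes a matrix-free implementation feasible.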

Author Information

Stephen Wright (UW-Madison)

Steve Wright is a Professor of Computer Sciences at the University of Wisconsin-Madison. His research interests lie in computational optimization and its applications to science and engineering. Prior to joining UW-Madison in 2001, Wright was a Senior Computer Scientist (1997-2001) and Computer Scientist (1990-1997) at Argonne National Laboratory, and Professor of Computer Science at the University of Chicago (2000-2001). He is the past Chair of the Mathematical Optimization Society (formerly the Mathematical Programming Society), the leading professional society in optimization, and a member of the Board of the Society for Industrial and Applied Mathematics (SIAM). Wright is the author or co-author of four widely used books in numerical optimization, including "Primal-Dual Interior-Point Methods" (SIAM, 1997) and "Numerical Optimization" (with J. Nocedal, Second Edition, Springer, 2006). He has also authored over 85 refereed journal papers on optimization theory, algorithms, software, and applications. He is coauthor of widely used interior-point software for linear and quadratic optimization. His recent research includes algorithms, applications, and theory for sparse optimization (including applications in compressed sensing and machine learning).
