

Plenary Speaker
Workshop: OPT 2021: Optimization for Machine Learning

Putting Randomized Matrix Algorithms in LAPACK, and Connections with Second-order Stochastic Optimization, Michael Mahoney

Abstract:

LAPACK (Linear Algebra PACKage) is a widely used, high-quality software library for numerical linear algebra that provides routines for solving systems of linear equations, linear least-squares problems, eigenvalue problems, and the singular value decomposition, as well as matrix factorizations such as the LU, QR, Cholesky, and Schur decompositions. Randomized Numerical Linear Algebra (RandNLA) is an interdisciplinary research area that exploits randomization as a computational resource to develop improved algorithms for large-scale linear algebra problems. In addition to providing some of the best linear algebra algorithms (in worst-case theory, in numerical implementations, and in non-trivial implicit statistical properties and machine learning implementations), RandNLA techniques are the basis for the best stochastic second-order optimization algorithms, such as Sub-Sampled Newton methods, the Iterative Hessian Sketch, and Newton-LESS. The time has come to put RandNLA methods into the next generation of LAPACK, and we have begun to do that. We will present our high-level plan for introducing RandNLA algorithms into LAPACK. The RandLAPACK library will implement state-of-the-art randomized algorithms for problems such as low-rank approximation and least squares, but its full scope will be larger than linear algebra per se: it will include certain higher-level primitives in optimization and machine learning that require judicious use of RandNLA. We will describe the building blocks, the modular design that will let RandLAPACK evolve over time with the field of RandNLA (as well as with the needs of machine learning, scientific computing, and other users), and the connections and implications for optimization in machine learning. Joint work with Riley Murray, Jim Demmel, and others.
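To make the randomized least-squares idea concrete, here is a minimal NumPy sketch of the classic sketch-and-solve paradigm, using a dense Gaussian sketching operator for simplicity. This is an illustrative toy only: the sketch dimension, data, and operator choice are assumptions for this example, and nothing here reflects RandLAPACK's actual API (a production library would favor fast operators such as subsampled trigonometric transforms or sparse sketches).

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 10_000, 50

# Tall synthetic least-squares problem: min_x ||A x - b||_2
A = rng.standard_normal((m, n))
b = A @ rng.standard_normal(n) + 0.01 * rng.standard_normal(m)

# Gaussian sketching operator S with d rows, d << m
d = 500
S = rng.standard_normal((d, m)) / np.sqrt(d)

# Sketch-and-solve: solve the much smaller d x n problem min_x ||(S A) x - S b||_2
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# Compare against the exact solution of the full problem
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
rel_err = np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact)
```

The sketched problem has only `d` rows instead of `m`, and for well-conditioned inputs with a small residual the sketched solution lands close to the exact one; more refined RandNLA schemes use the sketch as a preconditioner and then iterate to full accuracy.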
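The second-order optimization connection can be illustrated in the same spirit. Below is a minimal sketch of a Sub-Sampled Newton iteration for regularized logistic regression: the gradient is computed exactly, while the Hessian is estimated from a uniformly sampled subset of rows. The data, sample size, and step scheme are assumptions made for this toy example, not the specific algorithms discussed in the talk.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 5000, 20

# Synthetic logistic-regression data (hypothetical setup for illustration)
A = rng.standard_normal((m, n))
w_true = rng.standard_normal(n)
y = (rng.random(m) < 1.0 / (1.0 + np.exp(-A @ w_true))).astype(float)

lam = 1e-3  # small ridge term keeps the problem strongly convex

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def full_gradient(w):
    # Exact gradient of the regularized logistic loss
    return A.T @ (sigmoid(A @ w) - y) / m + lam * w

def subsampled_hessian(w, s):
    # Hessian estimated from s uniformly sampled rows of A
    idx = rng.choice(m, size=s, replace=False)
    As = A[idx]
    p = sigmoid(As @ w)
    dcurv = p * (1.0 - p)  # per-sample curvature weights
    return (As.T * dcurv) @ As / s + lam * np.eye(n)

w = np.zeros(n)
for _ in range(20):
    g = full_gradient(w)              # exact first-order information
    H = subsampled_hessian(w, s=500)  # cheap randomized second-order estimate
    w -= np.linalg.solve(H, g)        # (undamped) Newton-type step
grad_norm = np.linalg.norm(full_gradient(w))
```

Each iteration solves only an n-by-n system built from a few hundred rows rather than the full dataset, yet the randomized curvature estimate is accurate enough to drive the gradient norm down rapidly; methods such as the Iterative Hessian Sketch and Newton-LESS replace uniform row sampling with more sophisticated RandNLA sketches.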