
Integration Methods and Optimization Algorithms
Damien Scieur · Vincent Roulet · Francis Bach · Alexandre d'Aspremont

Mon Dec 06:30 PM -- 10:30 PM PST @ Pacific Ballroom #174

We show that accelerated optimization methods can be seen as particular instances of multi-step integration schemes from numerical analysis, applied to the gradient flow equation. In contrast with recent work in this vein, the differential equation considered here is the basic gradient flow, and we derive a class of multi-step schemes that includes accelerated algorithms, using classical conditions from numerical analysis. Multi-step schemes integrate the differential equation with larger step sizes, which intuitively explains the acceleration phenomenon.
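The connection can be sketched numerically: explicit Euler integration of the gradient flow $\dot{x} = -\nabla f(x)$ is exactly gradient descent, while a simple two-step linear scheme (here, heavy-ball momentum, used as an illustrative stand-in rather than the exact schemes derived in the paper) integrates the same flow and converges faster on an ill-conditioned quadratic.

```python
import numpy as np

# Gradient flow x'(t) = -grad f(x) for the quadratic f(x) = 0.5 * x^T A x.
# (Illustrative sketch only; the choice of A, steps, and momentum parameter
# are assumptions, not taken from the paper.)
A = np.diag([1.0, 10.0])          # convex quadratic, condition number 10

def grad(x):
    return A @ x

def euler(x0, h, n):
    # One-step scheme: x_{k+1} = x_k - h * grad f(x_k)  (gradient descent).
    x = x0.copy()
    for _ in range(n):
        x = x - h * grad(x)
    return x

def two_step(x0, h, beta, n):
    # Two-step scheme: x_{k+1} = x_k - h * grad f(x_k) + beta * (x_k - x_{k-1}),
    # i.e. heavy-ball momentum, a linear multi-step integrator of gradient flow.
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(n):
        x_next = x - h * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

x0 = np.array([1.0, 1.0])
print(np.linalg.norm(euler(x0, 0.1, 50)))     # slow along the small eigenvalue
print(np.linalg.norm(two_step(x0, 0.1, 0.5, 50)))  # momentum variant, closer to 0
```

With these (assumed) parameters the one-step scheme contracts the slow eigendirection by a factor 0.9 per iteration, while the two-step scheme's characteristic roots have modulus about 0.71, illustrating how multi-step schemes tolerate effectively larger steps.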

Author Information

Damien Scieur (INRIA - ENS)
Vincent Roulet (University of Washington)
Francis Bach (Inria)

Francis Bach is a researcher at INRIA, where he has led the SIERRA project-team, part of the Computer Science Department at Ecole Normale Supérieure in Paris, France, since 2011. After completing his Ph.D. in Computer Science at U.C. Berkeley, he spent two years at Ecole des Mines, then joined INRIA and Ecole Normale Supérieure in 2007. He is interested in statistical machine learning, especially convex optimization, combinatorial optimization, sparse methods, kernel-based learning, vision, and signal processing. In recent years he has given numerous summer-school courses on optimization. He was program co-chair of the International Conference on Machine Learning in 2015.

Alexandre d'Aspremont (CNRS - Ecole Normale Supérieure)
