

Multiple talks in Workshop: OPT2020: Optimization for Machine Learning

Contributed talks in Session 2 (Zoom)

Martin Takac · Samuel Horváth · Guan-Horng Liu · Nicolas Loizou · Sharan Vaswani


Abstract:

Join us to hear new, exciting work at the intersection of optimization and machine learning. Come ask questions and join the discussion.

Speakers:

Samuel Horváth, "Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization"
Guan-Horng Liu, "DDPNOpt: Differential Dynamic Programming Neural Optimizer"
Nicolas Loizou, "Stochastic Polyak Step-size for SGD: An Adaptive Learning Rate for Fast Convergence"
Sharan Vaswani, "Adaptive Gradient Methods Converge Faster with Over-Parameterization (and you can do a line-search)"
Sharan Vaswani, "How to make your optimizer generalize better"

A video in which the speakers discuss their papers in detail is available on the NeurIPS website.