Contributed Video: Distributed Proximal Splitting Algorithms with Rates and Acceleration
Laurent Condat
2020 Talk
in
Workshop: OPT2020: Optimization for Machine Learning
Abstract
We propose new generic distributed proximal splitting algorithms, well suited for large-scale convex nonsmooth optimization. We derive sublinear and linear convergence results with new nonergodic rates, as well as new accelerated versions of the algorithms, using varying stepsizes.
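To make the setting concrete, here is a minimal sketch of a classical proximal splitting method, forward-backward splitting (proximal gradient) applied to the lasso problem. This is a generic illustration of the algorithm family the abstract refers to, not the distributed algorithms proposed in the talk; the problem instance, the fixed stepsize 1/L, and all names below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, n_iter=500):
    # Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # (Illustrative single-machine version; the talk concerns distributed variants.)
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    step = 1.0 / L                         # constant stepsize; varying stepsizes give acceleration
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # forward (gradient) step on the smooth term
        x = soft_threshold(x - step * grad, step * lam)  # backward (prox) step on the nonsmooth term
    return x

# Hypothetical example: recover a sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = proximal_gradient_lasso(A, b, lam=0.1)
```

Each iteration splits the objective: the smooth data-fit term is handled by an explicit gradient step, and the nonsmooth regularizer by its proximal operator, which here has the closed-form soft-thresholding expression.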
Video