We study the decentralized distributed computation of discrete approximations for the regularized Wasserstein barycenter of a finite set of continuous probability measures stored in a distributed manner over a network. We assume there is a network of agents/machines/computers, each of which holds a private continuous probability measure and seeks to compute the barycenter of all the measures in the network by drawing samples from its local measure and exchanging information with its neighbors. Motivated by this problem, we develop and analyze a novel accelerated primal-dual stochastic gradient method for general stochastic convex optimization problems with linear equality constraints. We then apply this method to the decentralized distributed optimization setting to obtain a new algorithm for the distributed semi-discrete regularized Wasserstein barycenter problem. Moreover, we give explicit non-asymptotic complexity bounds for the proposed algorithm. Finally, we demonstrate the effectiveness of our method on the distributed computation of the regularized Wasserstein barycenter of univariate Gaussian and von Mises distributions, as well as on some applications to image aggregation.
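The core template underlying the method in the abstract can be illustrated on a toy instance. The sketch below is not the paper's accelerated algorithm: it is a plain (non-accelerated) primal-dual stochastic gradient iteration for min_x f(x) subject to a linear equality constraint, run on a hypothetical quadratic objective f(x) = 0.5·||x − c||² with a single constraint aᵀx = b, where c, a, b, the step size, and the noise level are all illustrative choices. The paper's scheme adds acceleration (momentum/extrapolation) on top of this basic primal descent / dual ascent loop.

```python
import numpy as np

# Toy instance (hypothetical, for illustration only):
#   minimize  f(x) = 0.5 * ||x - c||^2   subject to  a^T x = b
rng = np.random.default_rng(0)
c = np.array([1.0, 2.0, 3.0])
a = np.ones(3)          # constraint vector
b = 1.0
eta = 0.05              # step size
noise_std = 0.1         # stochastic-gradient noise level

x, lam = np.zeros(3), 0.0
avg = np.zeros(3)
n_iter, burn_in = 5000, 2500
for k in range(n_iter):
    # Stochastic gradient of the Lagrangian in x: (x - c) + lam * a + noise.
    g = (x - c) + lam * a + noise_std * rng.standard_normal(3)
    x = x - eta * g                  # primal (stochastic) descent step
    lam = lam + eta * (a @ x - b)    # dual ascent step on the constraint residual
    if k >= burn_in:                 # average late iterates to damp the noise
        avg += x
avg /= (n_iter - burn_in)

# Closed-form solution of the toy problem, for comparison:
# lam* = (a^T c - b) / (a^T a),  x* = c - lam* * a
lam_star = (a @ c - b) / (a @ a)
x_star = c - lam_star * a
```

Averaging the iterates is the standard way to extract an approximate solution from a stochastic primal-dual method; the averaged primal point should approximately satisfy the constraint and lie near the analytic minimizer.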
Author Information
Pavel Dvurechenskii (Weierstrass Institute, Berlin)
Since 2015: Research assistant, Research Group 6 "Stochastic Algorithms and Nonparametric Statistics", Weierstrass Institute for Applied Analysis and Stochastics, Berlin
2014 - 2015: Research assistant, Institute for Information Transmission Problems, Moscow, Russia
2009 - 2015: Junior researcher, Moscow Institute of Physics and Technology, Moscow, Russia
2013: Ph.D., Moscow Institute of Physics and Technology, Moscow, Russia
2010: Master's Diploma, Moscow Institute of Physics and Technology, Moscow, Russia
2008: Bachelor's Diploma, Moscow Institute of Physics and Technology, Moscow, Russia
Darina Dvinskikh (WIAS im Forschungsverbund Berlin e. V.)
2018: Master's Diploma, Moscow Institute of Physics and Technology, Moscow, Russia
2018: Master's Diploma, Skolkovo Institute of Science and Technology, Moscow, Russia
2016: Bachelor's Diploma, Moscow Institute of Physics and Technology, Moscow, Russia
Alexander Gasnikov (SkolTech)
Cesar Uribe (Massachusetts Institute of Technology)
Angelia Nedich (Arizona State University)
Related Events (a corresponding poster, oral, or spotlight)
-
2018 Poster: Decentralize and Randomize: Faster Algorithm for Wasserstein Barycenters »
Thu. Dec 6th through Fri. Dec 7th, Room 210 #15
More from the Same Authors
-
2021 : Decentralized Personalized Federated Learning: Lower Bounds and Optimal Algorithm for All Personalization Modes »
Abdurakhmon Sadiev · Ekaterina Borodich · Darina Dvinskikh · Aleksandr Beznosikov · Alexander Gasnikov -
2021 : Decentralized Personalized Federated Min-Max Problems »
Ekaterina Borodich · Aleksandr Beznosikov · Abdurakhmon Sadiev · Vadim Sushko · Alexander Gasnikov -
2022 : Effects of momentum scaling for SGD »
Dmitry A. Pasechnyuk · Alexander Gasnikov · Martin Takac -
2023 Poster: Accelerated Zeroth-order Method for Non-Smooth Stochastic Convex Optimization Problem with Infinite Variance »
Nikita Kornilov · Ohad Shamir · Aleksandr Lobanov · Alexander Gasnikov · Innokentiy Shibaev · Eduard Gorbunov · Darina Dvinskikh · Samuel Horváth -
2023 Poster: Similarity, Compression and Local Steps: Three Pillars of Efficient Communications for Distributed Variational Inequalities »
Aleksandr Beznosikov · Martin Takac · Alexander Gasnikov -
2023 Poster: First Order Methods with Markovian Noise: from Acceleration to Variational Inequalities »
Aleksandr Beznosikov · Sergey Samsonov · Marina Sheshukova · Alexander Gasnikov · Alexey Naumov · Eric Moulines -
2022 Spotlight: Accelerated Primal-Dual Gradient Method for Smooth and Convex-Concave Saddle-Point Problems with Bilinear Coupling »
Dmitry Kovalev · Alexander Gasnikov · Peter Richtarik -
2022 Spotlight: The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization »
Dmitry Kovalev · Alexander Gasnikov -
2022 Spotlight: Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees »
Aleksandr Beznosikov · Peter Richtarik · Michael Diskin · Max Ryabinin · Alexander Gasnikov -
2022 Spotlight: Optimal Algorithms for Decentralized Stochastic Variational Inequalities »
Dmitry Kovalev · Aleksandr Beznosikov · Abdurakhmon Sadiev · Michael Persiianov · Peter Richtarik · Alexander Gasnikov -
2022 Spotlight: Optimal Gradient Sliding and its Application to Optimal Distributed Optimization Under Similarity »
Dmitry Kovalev · Aleksandr Beznosikov · Ekaterina Borodich · Alexander Gasnikov · Gesualdo Scutari -
2022 Spotlight: The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization »
Dmitry Kovalev · Alexander Gasnikov -
2022 Spotlight: Decentralized Local Stochastic Extra-Gradient for Variational Inequalities »
Aleksandr Beznosikov · Pavel Dvurechenskii · Anastasiia Koloskova · Valentin Samokhin · Sebastian Stich · Alexander Gasnikov -
2022 Poster: Optimal Gradient Sliding and its Application to Optimal Distributed Optimization Under Similarity »
Dmitry Kovalev · Aleksandr Beznosikov · Ekaterina Borodich · Alexander Gasnikov · Gesualdo Scutari -
2022 Poster: Clipped Stochastic Methods for Variational Inequalities with Heavy-Tailed Noise »
Eduard Gorbunov · Marina Danilova · David Dobre · Pavel Dvurechenskii · Alexander Gasnikov · Gauthier Gidel -
2022 Poster: The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization »
Dmitry Kovalev · Alexander Gasnikov -
2022 Poster: A Damped Newton Method Achieves Global $\mathcal O \left(\frac{1}{k^2}\right)$ and Local Quadratic Convergence Rate »
Slavomír Hanzely · Dmitry Kamzolov · Dmitry Pasechnyuk · Alexander Gasnikov · Peter Richtarik · Martin Takac -
2022 Poster: The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization »
Dmitry Kovalev · Alexander Gasnikov -
2022 Poster: Optimal Algorithms for Decentralized Stochastic Variational Inequalities »
Dmitry Kovalev · Aleksandr Beznosikov · Abdurakhmon Sadiev · Michael Persiianov · Peter Richtarik · Alexander Gasnikov -
2022 Poster: Accelerated Primal-Dual Gradient Method for Smooth and Convex-Concave Saddle-Point Problems with Bilinear Coupling »
Dmitry Kovalev · Alexander Gasnikov · Peter Richtarik -
2022 Poster: Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees »
Aleksandr Beznosikov · Peter Richtarik · Michael Diskin · Max Ryabinin · Alexander Gasnikov -
2022 Poster: Decentralized Local Stochastic Extra-Gradient for Variational Inequalities »
Aleksandr Beznosikov · Pavel Dvurechenskii · Anastasiia Koloskova · Valentin Samokhin · Sebastian Stich · Alexander Gasnikov -
2021 Poster: Distributed Saddle-Point Problems Under Data Similarity »
Aleksandr Beznosikov · Gesualdo Scutari · Alexander Rogozin · Alexander Gasnikov -
2021 Poster: Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks »
Dmitry Kovalev · Elnur Gasanov · Alexander Gasnikov · Peter Richtarik -
2020 Poster: Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping »
Eduard Gorbunov · Marina Danilova · Alexander Gasnikov -
2018 : Poster spotlight »
Tianbao Yang · Pavel Dvurechenskii · Panayotis Mertikopoulos · Hugo Berard -
2016 Poster: Learning Supervised PageRank with Gradient-Based and Gradient-Free Optimization Methods »
Lev Bogolubsky · Pavel Dvurechenskii · Alexander Gasnikov · Gleb Gusev · Yurii Nesterov · Andrei M Raigorodskii · Aleksey Tikhonov · Maksim Zhukovskii