The stochastic compositional optimization problem covers a wide range of machine learning models, such as sparse additive models and model-agnostic meta-learning, so it is necessary to develop efficient methods for its optimization. Existing methods for the stochastic compositional optimization problem focus only on the single-machine scenario, which is far from satisfactory when data are distributed across different devices. To address this problem, we propose novel decentralized stochastic compositional gradient descent methods to efficiently solve large-scale stochastic compositional optimization problems. To the best of our knowledge, our work is the first to facilitate decentralized training for this kind of problem. Furthermore, we provide a convergence analysis for our methods, which shows that their convergence rate achieves linear speedup with respect to the number of devices. Finally, we apply our decentralized training methods to the model-agnostic meta-learning problem, and the experimental results confirm the superior performance of our methods.
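To illustrate the problem class, here is a minimal sketch of decentralized stochastic compositional gradient descent on a toy quadratic instance, minimizing f(g(x)) with a noisy inner map g(x) = Ax and outer loss f(y) = 0.5‖y − b‖². The instance, the ring topology, the step sizes, and the noise model are all illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

# Hypothetical toy instance (not the paper's exact algorithm): n nodes jointly
# minimize f(g(x)) where the inner map g(x) = A x is only observed through
# noisy samples and the outer loss is f(y) = 0.5 * ||y - b||^2. Each node runs
# a stochastic compositional gradient step plus gossip averaging on a ring.
rng = np.random.default_rng(0)
d, n_nodes, T = 5, 4, 1000
A = np.eye(d) + 0.1 * rng.standard_normal((d, d))  # well-conditioned inner map
b = rng.standard_normal(d)
alpha, beta = 0.1, 0.5          # step size, inner-estimate tracking rate

# Doubly stochastic mixing matrix for a ring topology
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] += 0.25
    W[i, (i + 1) % n_nodes] += 0.25

x = np.zeros((n_nodes, d))      # local model copies, one per node
y = np.zeros((n_nodes, d))      # local running estimates of g(x_i)
for _ in range(T):
    x = W @ x                   # consensus step: average with ring neighbors
    for i in range(n_nodes):
        g_sample = A @ x[i] + 0.01 * rng.standard_normal(d)   # noisy g(x_i)
        y[i] = (1 - beta) * y[i] + beta * g_sample            # track g(x_i)
        # Compositional gradient (grad g)^T grad f evaluated at the tracked y,
        # which sidesteps the bias of plugging a single sample into f(g(.)).
        x[i] -= alpha * A.T @ (y[i] - b)

x_bar = x.mean(axis=0)
print(np.linalg.norm(A @ x_bar - b))  # residual of the compositional objective
```

The moving average `y` is the standard device in stochastic compositional methods for estimating the inner expectation, since a single-sample plug-in estimate of the full gradient would be biased; the gossip step `W @ x` is what makes the scheme decentralized rather than server-coordinated.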
Author Information
Hongchang Gao (Temple University)
Heng Huang (University of Pittsburgh)
More from the Same Authors
- 2022 : FedGRec: Federated Graph Recommender System with Lazy Update of Latent Embeddings »
  Junyi Li · Heng Huang
- 2022 : Cooperation or Competition: Avoiding Player Domination for Multi-target Robustness by Adaptive Budgets »
  Yimu Wang · Dinghuai Zhang · Yihan Wu · Heng Huang · Hongyang Zhang
- 2022 Poster: MetricFormer: A Unified Perspective of Correlation Exploring in Similarity Learning »
  Jiexi Yan · Erkun Yang · Cheng Deng · Heng Huang
- 2022 Poster: Enhanced Bilevel Optimization via Bregman Distance »
  Feihu Huang · Junyi Li · Shangqian Gao · Heng Huang
- 2021 Poster: Optimal Underdamped Langevin MCMC Method »
  Zhengmian Hu · Feihu Huang · Heng Huang
- 2021 Poster: SUPER-ADAM: Faster and Universal Framework of Adaptive Gradients »
  Feihu Huang · Junyi Li · Heng Huang
- 2021 Poster: Efficient Mirror Descent Ascent Methods for Nonsmooth Minimax Problems »
  Feihu Huang · Xidong Wu · Heng Huang
- 2021 Poster: A Faster Decentralized Algorithm for Nonconvex Minimax Problems »
  Wenhan Xian · Feihu Huang · Yanfu Zhang · Heng Huang
- 2019 Poster: Curvilinear Distance Metric Learning »
  Shuo Chen · Lei Luo · Jian Yang · Chen Gong · Jun Li · Heng Huang
- 2018 Poster: Bilevel Distance Metric Learning for Robust Image Recognition »
  Jie Xu · Lei Luo · Cheng Deng · Heng Huang
- 2018 Poster: Training Neural Networks Using Features Replay »
  Zhouyuan Huo · Bin Gu · Heng Huang
- 2018 Spotlight: Training Neural Networks Using Features Replay »
  Zhouyuan Huo · Bin Gu · Heng Huang
- 2017 Poster: Group Sparse Additive Machine »
  Hong Chen · Xiaoqian Wang · Cheng Deng · Heng Huang
- 2017 Poster: Regularized Modal Regression with Applications in Cognitive Impairment Prediction »
  Xiaoqian Wang · Hong Chen · Weidong Cai · Dinggang Shen · Heng Huang
- 2017 Poster: Learning A Structured Optimal Bipartite Graph for Co-Clustering »
  Feiping Nie · Xiaoqian Wang · Cheng Deng · Heng Huang