UMIX: Improving Importance Weighting for Subpopulation Shift via Uncertainty-Aware Mixup
Zongbo Han · Zhipeng Liang · Fan Yang · Liu Liu · Lanqing Li · Yatao Bian · Peilin Zhao · Bingzhe Wu · Changqing Zhang · Jianhua Yao

Tue Nov 29 09:00 AM -- 11:00 AM (PST) @ Hall J #128

Subpopulation shift is common in many real-world machine learning applications; it refers to settings where the training and test distributions contain the same subpopulation groups but differ in subpopulation frequencies. Importance reweighting is a standard way to handle subpopulation shift by imposing constant or adaptive sampling weights on each sample in the training dataset. However, recent studies have shown that most of these approaches fail to improve performance over empirical risk minimization, especially when applied to over-parameterized neural networks. In this work, we propose a simple yet practical framework, called uncertainty-aware mixup (UMIX), to mitigate the overfitting issue in over-parameterized models by reweighting the ''mixed'' samples according to the sample uncertainty. UMIX is equipped with training-trajectory-based uncertainty estimation for each sample to flexibly characterize the subpopulation distribution. We also provide an insightful theoretical analysis verifying that UMIX achieves better generalization bounds than prior works. Further, we conduct extensive empirical studies across a wide range of tasks to validate the effectiveness of our method both qualitatively and quantitatively. Code is available at https://github.com/TencentAILabHealthcare/UMIX.
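The core idea in the abstract can be illustrated in a few lines: mix pairs of training samples (as in standard mixup) and then reweight each mixed sample using per-sample uncertainty scores. The sketch below is a minimal, hedged illustration of that idea, not the authors' implementation (see the linked repository for the actual code); the `uncertainty` array is a stand-in, whereas the paper estimates it from training trajectories, and the weighting rule here is an assumed illustrative choice.

```python
import numpy as np

def umix_batch(x, y, uncertainty, alpha=1.0, rng=None):
    """Sketch: mix a batch of samples and attach uncertainty-based weights.

    x           : (n, d) array of inputs
    y           : (n,) array of labels
    uncertainty : (n,) per-sample uncertainty scores (placeholder here;
                  the paper derives these from training trajectories)
    """
    rng = np.random.default_rng(rng)
    n = len(x)
    lam = rng.beta(alpha, alpha)           # mixup mixing coefficient
    perm = rng.permutation(n)              # random partner for each sample
    x_mix = lam * x + (1 - lam) * x[perm]  # convex combination of inputs
    # Illustrative reweighting: combine the uncertainties of both members
    # of each mixed pair so that uncertain samples get larger loss weights.
    w = lam * uncertainty + (1 - lam) * uncertainty[perm]
    return x_mix, (y, y[perm], lam), w

# Toy usage with placeholder data and uncertainty scores.
x = np.arange(8.0).reshape(4, 2)
y = np.array([0, 1, 0, 1])
u = np.array([0.9, 0.1, 0.5, 0.3])
x_mix, (y_a, y_b, lam), w = umix_batch(x, y, u, rng=0)
```

In training, the weights `w` would multiply the per-sample losses on the mixed batch (against both `y_a` and `y_b`, interpolated by `lam`), so groups whose samples the model is uncertain about are effectively upweighted.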

Author Information

Zongbo Han (Tianjin University)
Zhipeng Liang (Hong Kong University of Science and Technology)
Fan Yang (Tsinghua University)
Liu Liu (Tencent AI Lab)
Lanqing Li (Tencent AI Lab)
Yatao Bian (Tencent AI Lab)
Peilin Zhao (Tencent AI Lab)
Bingzhe Wu (Peking University)
Changqing Zhang (Tianjin University)
Jianhua Yao (National Institutes of Health)