Poster
in
Workshop: Workshop on Distribution Shifts: New Frontiers with Foundation Models

Simplifying and Stabilizing Model Selection in Unsupervised Domain Adaptation

Dapeng Hu · Romy Luo · Jian Liang · Chuan Sheng Foo

Keywords: [ Unsupervised Domain Adaptation; Unsupervised Model Selection; Unsupervised Hyperparameter Selection ]


Abstract:

Existing model selection methods for unsupervised domain adaptation (UDA) often struggle to maintain stable performance across diverse UDA methods and scenarios, frequently yielding suboptimal or even the worst hyperparameter choices. This instability poses severe risks to the safe deployment of UDA models in practice and significantly impairs the practicality and reliability of these selection approaches. To address this challenge, we introduce EnsV, a novel ensemble-based validation approach that aims to simplify and stabilize model selection in UDA. EnsV relies solely on predictions on unlabeled target data without making any assumptions about distribution shifts, offering high simplicity and versatility. Additionally, EnsV is built upon an off-the-shelf ensemble that is theoretically guaranteed to outperform the worst candidate model, ensuring high stability. In our experiments, we benchmark EnsV against 8 competitive model selection approaches, evaluating its performance across 12 UDA methods, 5 diverse UDA benchmarks, and 5 popular UDA scenarios. The results consistently highlight EnsV as a simple, versatile, and stable choice for practical model selection in UDA.
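The abstract does not spell out the selection criterion, but one plausible reading of an ensemble-based validation scheme is: average the candidate models' predictions on unlabeled target data to form an ensemble, then score each candidate by its agreement with the ensemble's predicted labels. The sketch below illustrates that idea; the averaging ensemble, the agreement score, and all variable names are assumptions for illustration, not the authors' exact method.

```python
import numpy as np

# Hypothetical sketch of ensemble-based validation for UDA model selection.
# Assumptions (not from the paper): the ensemble is the plain average of the
# candidates' softmax predictions, and each candidate is scored by how often
# its hard predictions agree with the ensemble's hard predictions.
rng = np.random.default_rng(0)
num_models, num_samples, num_classes = 4, 100, 5

# Simulated softmax predictions of each candidate model on unlabeled
# target data; shape (num_models, num_samples, num_classes).
logits = rng.normal(size=(num_models, num_samples, num_classes))
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)

# Ensemble prediction: average over candidate models, shape (N, C).
ensemble = probs.mean(axis=0)
ens_labels = ensemble.argmax(axis=-1)

# Agreement of each candidate's hard labels with the ensemble's labels.
scores = (probs.argmax(axis=-1) == ens_labels).mean(axis=1)
best = int(scores.argmax())  # index of the selected candidate model
```

Note that this sketch needs only target-domain predictions, no target labels and no assumptions about the source/target distribution shift, which is consistent with the simplicity and versatility the abstract claims.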
