Universal Domain Adaptation (UniDA) aims to transfer knowledge from a source domain to a target domain without any constraints on label sets. Since both domains may hold private classes, identifying target common samples for domain alignment is an essential issue in UniDA. Most existing methods require manually specified or hand-tuned threshold values to detect common samples, and are therefore hard to extend to more realistic UniDA settings where the ratio of common classes varies widely. Moreover, they cannot recognize different categories among target-private samples, because these private samples are treated as a whole. In this paper, we propose to use Optimal Transport (OT) to handle these issues under a unified framework, namely UniOT. First, an OT-based partial alignment with adaptive filling is designed to detect common classes without any predefined threshold values for realistic UniDA. It can automatically discover the intrinsic difference between common and private classes based on the statistical information of the assignment matrix obtained from OT. Second, we propose OT-based target representation learning that encourages both global discrimination and local consistency of samples, avoiding over-reliance on the source domain. Notably, UniOT is the first method with the capability to automatically discover and recognize private categories in the target domain for UniDA. Accordingly, we introduce a new metric, the $H^3$-score, to evaluate performance in terms of both the accuracy on common samples and the clustering quality on private ones. Extensive experiments clearly demonstrate the advantages of UniOT over a wide range of state-of-the-art methods in UniDA.
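The abstract describes OT-based partial alignment only at a high level. The sketch below is a minimal, hypothetical illustration of the general idea: run entropic OT (Sinkhorn) between target features and source class prototypes, transport only part of the mass by routing the remainder to a dummy column, and treat target samples whose mass mostly goes to the dummy column as candidate private-class samples. All names (`sinkhorn`, `partial_alignment`, `transported_mass`, the median dummy cost, the 0.5 cut-off) are illustrative assumptions and not UniOT's actual algorithm; in particular, the paper infers the common-class mass adaptively from statistics of the assignment matrix rather than using a fixed `transported_mass` fraction.

```python
import numpy as np

def sinkhorn(a, b, cost, reg=0.05, n_iters=200):
    """Entropic OT plan via Sinkhorn-Knopp; a and b must sum to the same total."""
    K = np.exp(-cost / reg)                      # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):                     # alternate marginal scalings
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]           # transport plan (n, K+1)

def partial_alignment(target_feats, prototypes, transported_mass=0.7, reg=0.05):
    """Illustrative partial OT between target samples and source class prototypes,
    using a dummy column that absorbs the untransported mass.

    `transported_mass` is a fixed fraction chosen here for illustration only;
    UniOT avoids such hand-set values.
    """
    # cosine-distance cost between L2-normalised features and prototypes
    t = target_feats / np.linalg.norm(target_feats, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    cost = 1.0 - t @ p.T                                  # (n, K)

    n, k = cost.shape
    a = np.full(n, 1.0 / n)                               # uniform target marginal
    b = np.full(k, transported_mass / k)                  # mass sent to real prototypes
    # dummy column absorbs the remaining mass at an intermediate cost
    cost_ext = np.hstack([cost, np.full((n, 1), np.median(cost))])
    b_ext = np.append(b, 1.0 - transported_mass)

    plan = sinkhorn(a, b_ext, cost_ext, reg=reg)
    dummy_mass = plan[:, -1]                              # mass routed to the dummy column
    is_private = dummy_mass > 0.5 * a                     # mostly unmatched -> candidate private
    pseudo_label = plan[:, :k].argmax(axis=1)             # best-matching source class otherwise
    return is_private, pseudo_label, plan
```

As a usage sketch, `target_feats` of shape (n, d) could come from a feature extractor and `prototypes` of shape (K, d) from source class centroids or classifier weights (both assumptions); the function then returns a boolean private-sample mask, pseudo-labels for the remaining samples, and the full assignment matrix whose statistics a threshold-free method could exploit.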
Author Information
Wanxing Chang (ShanghaiTech University)
Ye Shi (ShanghaiTech University)
Hoang Tuan (University of Technology Sydney)
Jingya Wang (ShanghaiTech University)
Related Events (a corresponding poster, oral, or spotlight)
- 2022 Poster: Unified Optimal Transport Framework for Universal Domain Adaptation »
  Wed. Nov 30th, 05:00 -- 07:00 PM, Room Hall J #134
More from the Same Authors
- 2023 Poster: CSOT: Curriculum and Structure-Aware Optimal Transport for Learning with Noisy Labels »
  Wanxing Chang · Ye Shi · Jingya Wang
- 2023 Poster: Reduced Policy Optimization for Continuous Control with Hard Constraints »
  Shutong Ding · Jingya Wang · Yali Du · Ye Shi
- 2023 Poster: Fed-CO$_{2}$: Cooperation of Online and Offline Models for Severe Data Heterogeneity in Federated Learning »
  Zhongyi Cai · Ye Shi · Wei Huang · Jingya Wang
- 2023 Poster: Two Sides of The Same Coin: Deep Equilibrium Models and Neural ODEs via Homotopy Continuation »
  Shutong Ding · Tianyu Cui · Jingya Wang · Ye Shi
- 2023 Poster: Contextually Affinitive Neighborhood Refinery for Deep Clustering »
  Chunlin Yu · Ye Shi · Jingya Wang
- 2022 Spotlight: Lightning Talks 3A-1 »
  Shu Ding · Wanxing Chang · Jiyang Guan · Mouxiang Chen · Guan Gui · Yue Tan · Shiyun Lin · Guodong Long · Yuze Han · Wei Wang · Zhen Zhao · Ye Shi · Jian Liang · Chenghao Liu · Lei Qi · Ran He · Jie Ma · Zemin Liu · Xiang Li · Hoang Tuan · Luping Zhou · Zhihua Zhang · Jianling Sun · Jingya Wang · LU LIU · Tianyi Zhou · Lei Wang · Jing Jiang · Yinghuan Shi