Unsupervised Domain Adaptation By Optimal Transportation Of Clusters Between Domains

29 Sep 2021 · Yang Liu, Zhipeng Zhou, Lei Shang, Baigui Sun, Hao Li, Rong Jin

Unsupervised domain adaptation (UDA) aims to transfer knowledge from a labeled source domain to an unlabeled target domain. To guarantee desirable knowledge transfer, aligning the source and target distributions from a global perspective is widely adopted in UDA. Recent work further points out the importance of local-level alignment and borrows from Optimal Transport (OT) theory to construct instance-pair alignment. However, existing OT-based algorithms are limited in resolving the class-imbalance challenge and incur a huge computational cost in large-scale training settings. In this paper, we address these two issues by proposing a Clustering-based Optimal Transport (COT) algorithm, which formulates the alignment procedure as an Optimal Transport problem that captures fine-grained attribute alignment. Concretely, COT designs a loss derived from the discrete Kantorovich dual form to construct a mapping between clustering centers in the source and target domains, which simultaneously eliminates the negative effect brought by class imbalance and reduces the computational cost, as supported by theoretical analysis. Finally, our COT, together with several previous UDA methods, achieves superior performance on several benchmarks.
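To make the cluster-level transport idea concrete, below is a minimal sketch that clusters features from each domain and computes an OT cost between the resulting centroids. Note the assumptions: the paper derives its loss from the discrete Kantorovich dual form, whereas this sketch uses a standard entropy-regularized (Sinkhorn) solver as an illustrative stand-in; the number of clusters, the regularization strength, and the uniform cluster weights are hypothetical choices, not taken from the paper.

```python
# Illustrative sketch: optimal transport between cluster centroids of two domains.
# Hyperparameters (n_clusters, epsilon, n_iters) are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

def cluster_centroids(features, n_clusters=8, seed=0):
    """Cluster one domain's features and return the cluster centers."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(features)
    return km.cluster_centers_

def sinkhorn_plan(mu, nu, cost, epsilon=0.1, n_iters=200):
    """Entropy-regularized OT plan between discrete measures mu and nu
    under the given cost matrix, via Sinkhorn iterations."""
    K = np.exp(-cost / epsilon)          # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(n_iters):
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    return u[:, None] * K * v[None, :]   # transport plan

# Toy usage: two synthetic "domains" with a simple covariate shift.
rng = np.random.default_rng(0)
src_feats = rng.normal(0.0, 1.0, size=(500, 64))
tgt_feats = rng.normal(0.5, 1.0, size=(500, 64))

Cs = cluster_centroids(src_feats)
Ct = cluster_centroids(tgt_feats)

# Squared Euclidean cost between source and target centroids.
cost = ((Cs[:, None, :] - Ct[None, :, :]) ** 2).sum(-1)
mu = np.full(len(Cs), 1.0 / len(Cs))     # uniform weight per source cluster
nu = np.full(len(Ct), 1.0 / len(Ct))     # uniform weight per target cluster

plan = sinkhorn_plan(mu, nu, cost)
ot_cost = (plan * cost).sum()            # cluster-level alignment cost
print(f"cluster-level OT cost: {ot_cost:.4f}")
```

Because the transport problem is solved over a handful of cluster centers rather than over all instance pairs, its size is independent of the number of training samples, which is the computational motivation the abstract points to.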

