MOST: Multi-Source Domain Adaptation via Optimal Transport for Student-Teacher Learning

Multi-source domain adaptation (DA) is more challenging than conventional DA because knowledge must be transferred from several source domains to a target domain. To this end, in this paper we propose a novel model for multi-source DA based on the theory of optimal transport and imitation learning. More specifically, our approach consists of two cooperative agents: a teacher classifier and a student classifier. The teacher classifier is a combined expert that leverages the knowledge of domain experts and is theoretically guaranteed to perfectly handle source examples, while the student classifier, acting on the target domain, tries to imitate the teacher classifier acting on the source domains. Our rigorous theory developed on top of optimal transport makes this cross-domain imitation possible and also helps to mitigate not only the data shift but also the label shift, two inherently thorny issues in DA research. We conduct comprehensive experiments on real-world datasets to demonstrate the merit of our approach and its optimal-transport-based imitation learning viewpoint. Experimental results show that our proposed method achieves, to the best of our knowledge, state-of-the-art performance on benchmark datasets for multi-source domain adaptation, including Digits-five, Office-Caltech10, and Office-31.
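The abstract's core mechanism is cross-domain imitation: an optimal transport plan couples source and target examples, so teacher predictions on source data can supervise the student on the target. A minimal NumPy sketch of that coupling step is shown below; the Sinkhorn solver, the feature dimensions, and the teacher's soft predictions are all illustrative assumptions, not the paper's actual architecture or training procedure.

```python
import numpy as np

def sinkhorn(a, b, M, reg=0.1, n_iters=200):
    # Entropic-regularized optimal transport via Sinkhorn-Knopp iterations.
    # a, b: source/target marginals; M: pairwise cost matrix.
    K = np.exp(-M / reg)
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]  # transport plan, shape (n_s, n_t)

rng = np.random.default_rng(0)
n_s, n_t, d, n_classes = 30, 20, 5, 3
Xs = rng.normal(size=(n_s, d))         # pooled source-domain features (toy data)
Xt = rng.normal(size=(n_t, d)) + 0.5   # shifted target-domain features (toy data)

# Hypothetical teacher soft predictions on the source examples.
logits = rng.normal(size=(n_s, n_classes))
P_teacher = np.exp(logits) / np.exp(logits).sum(1, keepdims=True)

# Squared-Euclidean cost, normalized for numerical stability.
M = ((Xs[:, None, :] - Xt[None, :, :]) ** 2).sum(-1)
M = M / M.max()
a = np.full(n_s, 1.0 / n_s)
b = np.full(n_t, 1.0 / n_t)
plan = sinkhorn(a, b, M)

# Barycentric transfer: each target point receives a convex mixture of
# teacher predictions from the source points it is coupled with. The
# student would then be trained to match these transported predictions.
P_target = (plan.T @ P_teacher) / plan.sum(0)[:, None]
```

By construction each row of `P_target` is a valid class distribution, so it can serve directly as a soft imitation target for the student classifier on the target domain.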

Leaderboard results (Task: Multi-Source Unsupervised Domain Adaptation; Model: MOST; Metric: Accuracy):

Dataset           Accuracy   Global Rank
Digits-five       95.4       #1
Office-31         86.4       #1
Office-Caltech10  98.1       #2
