Cluster Alignment with a Teacher for Unsupervised Domain Adaptation

ICCV 2019 · Zhijie Deng, Yucen Luo, Jun Zhu

Deep learning methods have shown promise in unsupervised domain adaptation, which aims to leverage a labeled source domain to learn a classifier for the unlabeled target domain with a different distribution. However, such methods typically learn a domain-invariant representation space to match the marginal distributions of the source and target domains, while ignoring their fine-level structures. In this paper, we propose Cluster Alignment with a Teacher (CAT) for unsupervised domain adaptation, which can effectively incorporate the discriminative clustering structures in both domains for better adaptation. Technically, CAT leverages an implicit ensembling teacher model to reliably discover the class-conditional structure in the feature space for the unlabeled target domain. Then CAT forces the features of both the source and the target domains to form discriminative class-conditional clusters and aligns the corresponding clusters across domains. Empirical results demonstrate that CAT achieves state-of-the-art results in several unsupervised domain adaptation scenarios.
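For intuition, here is a minimal PyTorch-style sketch of the three ingredients named in the abstract: an exponential-moving-average (EMA) teacher as the implicit ensemble, a discriminative clustering term that pulls same-class features together and pushes different classes apart, and a per-class centroid alignment term across domains. The function names, the margin value, and the EMA decay below are illustrative assumptions, not the paper's exact losses or hyperparameters.

```python
import torch
import torch.nn.functional as F

def ema_update(teacher, student, decay=0.99):
    # EMA teacher: its weights track the student, acting as an implicit
    # ensemble of past student models (decay value is an assumption).
    with torch.no_grad():
        for t_p, s_p in zip(teacher.parameters(), student.parameters()):
            t_p.mul_(decay).add_(s_p, alpha=1.0 - decay)

def cluster_loss(features, labels, margin=3.0):
    # Discriminative clustering: shrink intra-class distances and keep
    # inter-class distances at least `margin` apart (margin is illustrative).
    d = torch.cdist(features, features)                      # pairwise L2 distances
    same = (labels[:, None] == labels[None, :]).float()      # 1 if same class
    pull = same * d.pow(2)
    push = (1.0 - same) * F.relu(margin - d).pow(2)
    return (pull + push).mean()

def align_loss(src_feats, src_labels, tgt_feats, tgt_pseudo, num_classes):
    # Cluster alignment: match per-class feature centroids of the source
    # (true labels) and target (teacher pseudo-labels) domains.
    loss = src_feats.new_zeros(())
    for c in range(num_classes):
        s_mask, t_mask = src_labels == c, tgt_pseudo == c
        if s_mask.any() and t_mask.any():
            diff = src_feats[s_mask].mean(0) - tgt_feats[t_mask].mean(0)
            loss = loss + diff.pow(2).sum()
    return loss / num_classes
```

In a full training loop these terms would be added to the usual source classification loss (and, for variants such as rRevGrad+CAT in the results below, an adversarial domain loss), with target pseudo-labels taken from the teacher's predictions.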


Results from the Paper


Task                Dataset         Model          Metric Name        Metric Value (%)   Global Rank
Domain Adaptation   ImageCLEF-DA    rRevGrad+CAT   Accuracy           80.7               #13
Domain Adaptation   MNIST-to-USPS   rRevGrad+CAT   Accuracy           96                 #9
Domain Adaptation   Office-31       rRevGrad+CAT   Average Accuracy   80.1               #30
Domain Adaptation   SVHN-to-MNIST   rRevGrad+CAT   Accuracy           98.8               #3
Domain Adaptation   USPS-to-MNIST   MCD+CAT        Accuracy           96.3               #10

Methods


No methods listed for this paper.