On Localized Discrepancy for Domain Adaptation

14 Aug 2020  ·  Yuchen Zhang, Mingsheng Long, Jian-Min Wang, Michael I. Jordan

We propose discrepancy-based generalization theories for unsupervised domain adaptation. Previous theories introduced distribution discrepancies defined as the supremum over the complete hypothesis space. That space may contain hypotheses that lead to unnecessary overestimation of the risk bound. This paper studies localized discrepancies, defined on the hypothesis space after localization. First, we show that these discrepancies have desirable properties. They can be significantly smaller than the previous discrepancies. Their values change when the two domains are exchanged, so they can reveal asymmetric transfer difficulties. Next, we derive improved generalization bounds with these discrepancies. We show that the discrepancies can influence the rate of the sample complexity. Finally, we further extend the localized discrepancies to achieve super transfer and derive generalization bounds that can be even more sample-efficient on the source domain.
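As a rough sketch of the localization idea, in our own notation (the paper's exact definitions may differ): write $P$ and $Q$ for the source and target distributions, $\mathcal{H}$ for the hypothesis space, and $\epsilon_D(h, h') = \mathbb{E}_{x \sim D}[\ell(h(x), h'(x))]$ for the disagreement of $h$ and $h'$ under a loss $\ell$ on domain $D$.

% Standard discrepancy: supremum over the whole hypothesis space.
\[
\mathrm{disc}_{\mathcal{H}}(P, Q)
  \;=\; \sup_{h,\, h' \in \mathcal{H}}
        \bigl|\, \epsilon_Q(h, h') - \epsilon_P(h, h') \,\bigr|
\]
% Localization: keep only hypotheses with source risk at most r > 0,
% where f_P denotes the source labeling function,
\[
\mathcal{H}_r \;=\; \bigl\{\, h \in \mathcal{H} \;:\; \epsilon_P(h, f_P) \le r \,\bigr\},
\]
% and take the supremum over the localized set instead:
\[
\mathrm{disc}_{\mathcal{H}_r}(P, Q)
  \;=\; \sup_{h,\, h' \in \mathcal{H}_r}
        \bigl|\, \epsilon_Q(h, h') - \epsilon_P(h, h') \,\bigr|
  \;\le\; \mathrm{disc}_{\mathcal{H}}(P, Q).
\]

Since $\mathcal{H}_r \subseteq \mathcal{H}$, the localized quantity can only be smaller, and because $\mathcal{H}_r$ is built from the source distribution, swapping the roles of the two domains changes the localized set, which is one way the asymmetry mentioned above can arise.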

