Partial Domain Adaptation

18 papers with code • 5 benchmarks • 5 datasets

Partial Domain Adaptation is a transfer learning paradigm that transfers relevant knowledge from a large-scale source domain to a small-scale target domain whose label space is a subset of the source label space.

Source: Deep Residual Correction Network for Partial Domain Adaptation


Most implemented papers

Larger Norm More Transferable: An Adaptive Feature Norm Approach for Unsupervised Domain Adaptation

jihanyang/AFN ICCV 2019

Domain adaptation enables the learner to safely generalize into novel environments by mitigating domain shifts across distributions.
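The paper's Hard Adaptive Feature Norm variant can be sketched as a simple penalty that pushes per-sample feature norms toward a shared large radius; a minimal numpy illustration, assuming a generic `r` hyperparameter (the stepwise variant instead grows the target norm incrementally during training):

```python
import numpy as np

def hafn_loss(features, r=25.0):
    """Hard Adaptive Feature Norm (sketch): penalize the squared gap
    between each sample's feature L2 norm and a shared target radius r,
    encouraging larger, more transferable feature norms."""
    norms = np.linalg.norm(features, axis=1)   # per-sample L2 norms, shape (B,)
    return float(np.mean((norms - r) ** 2))    # mean squared deviation from r
```

A feature whose norm already equals `r` contributes zero loss, so minimizing this term enlarges small-norm (less transferable) features.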

Minimum Class Confusion for Versatile Domain Adaptation

thuml/Versatile-Domain-Adaptation ECCV 2020

It can be characterized as (1) a non-adversarial DA method that does not explicitly deploy domain alignment, enjoying faster convergence; (2) a versatile approach that can handle four existing scenarios: Closed-Set, Partial-Set, Multi-Source, and Multi-Target DA, outperforming the state-of-the-art methods in these scenarios, especially on one of the largest and hardest datasets to date (7.3% on DomainNet).
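The class-confusion idea can be sketched in a few lines: build a class-confusion matrix from temperature-scaled target predictions, re-weight examples by certainty, and penalize the off-diagonal entries. A minimal numpy sketch, with the temperature value and weighting form as assumptions:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def mcc_loss(target_logits, temperature=2.5):
    """Minimum Class Confusion (sketch): penalize between-class
    confusion in predictions on unlabeled target data."""
    probs = softmax(target_logits / temperature)            # (B, C)
    entropy = -(probs * np.log(probs + 1e-8)).sum(axis=1)   # per-example uncertainty
    weights = 1.0 + np.exp(-entropy)                        # more certain -> larger weight
    weights = len(weights) * weights / weights.sum()        # normalize to mean 1
    confusion = (weights[:, None] * probs).T @ probs        # (C, C) confusion matrix
    confusion /= confusion.sum(axis=1, keepdims=True)       # normalize each class row
    num_classes = probs.shape[1]
    return float((confusion.sum() - np.trace(confusion)) / num_classes)
```

Confident, well-separated predictions concentrate mass on the diagonal and drive the loss toward zero, without any explicit source/target alignment.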

Importance Weighted Adversarial Nets for Partial Domain Adaptation

thuml/Transfer-Learning-Library CVPR 2018

This paper proposes an importance-weighted adversarial nets-based method for unsupervised domain adaptation, designed specifically for partial domain adaptation, where the target domain has fewer classes than the source domain.
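The core mechanism can be illustrated with a small sketch: a first domain discriminator scores how source-like each source sample looks, and samples that resemble the target receive larger importance weights in the subsequent adversarial alignment. The exact weighting form below is an assumption for illustration, not the paper's precise formula:

```python
import numpy as np

def source_importance_weights(d_out_source, eps=1e-6):
    """Sketch of importance weighting from a domain discriminator.
    d_out_source[i] is the discriminator's probability that source
    sample i is 'source'. Target-like samples (small d) get large
    weights w = (1 - d) / d, normalized to mean 1, so source samples
    from outlier classes are suppressed during alignment."""
    d = np.clip(d_out_source, eps, 1 - eps)   # avoid division by zero
    w = (1.0 - d) / d                         # density-ratio style weight
    return w / w.mean()                       # normalize to mean 1
```

A source sample the discriminator confidently labels as source (e.g. `d = 0.9`) is thus down-weighted relative to one that could plausibly be target data (`d = 0.5`).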

Partial Adversarial Domain Adaptation

thuml/PADA ECCV 2018

We present Partial Adversarial Domain Adaptation (PADA), which simultaneously alleviates negative transfer by down-weighting the data of outlier source classes when training both the source classifier and the domain adversary, and promotes positive transfer by matching the feature distributions in the shared label space.
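The class-weighting step described above can be sketched directly: average the source classifier's softmax predictions over all target samples, then normalize, so that outlier source classes (rarely predicted on target data) get weights near zero. A minimal numpy sketch:

```python
import numpy as np

def class_weights_from_target(target_probs):
    """PADA-style class weights (sketch). target_probs is a (B, C)
    matrix of softmax predictions on target samples. Classes that the
    classifier rarely predicts on target data receive small weights
    and are down-weighted in both the classification and the
    domain-adversarial losses."""
    gamma = target_probs.mean(axis=0)   # (C,) average predicted probability
    return gamma / gamma.max()          # scale so the largest weight is 1
```

During training these weights multiply the per-class source losses, steering adversarial alignment toward the shared label space.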

Improving Mini-batch Optimal Transport via Partial Transportation

ut-austin-data-science-group/mini-batch-ot 22 Aug 2021

Mini-batch optimal transport (m-OT) has been widely used recently to deal with the memory issue of OT in large-scale applications.
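The mini-batch idea, and the partial-transport twist, can be illustrated with a toy sketch (this is a simplified stand-in, not the paper's algorithm): average an exact one-to-one matching cost over random mini-batch pairs, and mimic partial transportation by keeping only the cheapest fraction `s` of matched pairs, discarding likely mismatches:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def minibatch_partial_ot_cost(xs, xt, batch_size=4, s=0.75, n_batches=8, rng=None):
    """Toy mini-batch partial transport cost between two point clouds.
    For each pair of random mini-batches, solve an exact one-to-one
    matching, then keep only the floor(s * batch_size) cheapest matched
    pairs, averaging the result over n_batches draws."""
    rng = np.random.default_rng(rng)
    keep = max(1, int(s * batch_size))
    total = 0.0
    for _ in range(n_batches):
        bs = xs[rng.choice(len(xs), batch_size, replace=False)]
        bt = xt[rng.choice(len(xt), batch_size, replace=False)]
        cost = np.linalg.norm(bs[:, None, :] - bt[None, :, :], axis=2)  # pairwise distances
        rows, cols = linear_sum_assignment(cost)                        # exact matching
        matched = np.sort(cost[rows, cols])
        total += matched[:keep].mean()                                  # drop costliest pairs
    return total / n_batches
```

Dropping the costliest matches is what makes the estimate robust to samples in one batch that have no good counterpart in the other, the failure mode of plain m-OT that partial transportation addresses.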

007: Democratically Finding The Cause of Packet Drops

behnazak/Vigil-007SourceCode 20 Feb 2018

Network failures continue to plague datacenter operators as their symptoms may not have direct correlation with where or why they occur.

Learning to Transfer Examples for Partial Domain Adaptation

thuml/ETN CVPR 2019

Under the condition that target labels are unknown, the key challenge of PDA is how to transfer relevant examples in the shared classes to promote positive transfer, and ignore irrelevant ones in the specific classes to mitigate negative transfer.

Universal Domain Adaptation through Self Supervision

VisionLearningGroup/DANCE NeurIPS 2020

While some methods address target settings with either partial or open-set categories, they assume that the particular setting is known a priori.

Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation

tim-learn/SHOT ICML 2020

Unsupervised domain adaptation (UDA) aims to leverage the knowledge learned from a labeled source dataset to solve similar tasks in a new unlabeled domain.
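Because SHOT adapts without source data, its target objective can be sketched as information maximization over the frozen source hypothesis's predictions: make each target prediction confident while keeping the batch-level class distribution diverse. A minimal numpy sketch of that objective (the pseudo-labeling component of SHOT is omitted):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def information_maximization_loss(target_logits, eps=1e-8):
    """Information-maximization sketch: minimize per-sample prediction
    entropy (confidence) while maximizing the entropy of the mean
    prediction (diversity), so the model does not collapse to one class."""
    p = softmax(target_logits)
    ent = -(p * np.log(p + eps)).sum(axis=1).mean()   # confidence term (minimize)
    p_mean = p.mean(axis=0)
    div = -(p_mean * np.log(p_mean + eps)).sum()      # diversity term (maximize)
    return float(ent - div)
```

Confident, class-balanced predictions drive the loss negative, while a uniform predictor sits at zero; only target logits are needed, so no source data is accessed.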

A Balanced and Uncertainty-aware Approach for Partial Domain Adaptation

tim-learn/BA3US ECCV 2020

On one hand, negative transfer results in target samples being misclassified into classes present only in the source domain.