Partial Domain Adaptation
19 papers with code • 5 benchmarks • 4 datasets
Partial Domain Adaptation is a transfer learning paradigm that transfers relevant knowledge from a large-scale source domain to a small-scale target domain.
Source: Deep Residual Correction Network for Partial Domain Adaptation
Latest papers
Selective Partial Domain Adaptation
To solve this problem, we propose a Selective Partial Domain Adaptation (SPDA) method, which selects useful data for the adaptation to the target domain.
OneRing: A Simple Method for Source-free Open-partial Domain Adaptation
In this paper, we investigate Source-free Open-partial Domain Adaptation (SF-OPDA), which addresses the situation where there exist both domain and category shifts between source and target domains.
From Big to Small: Adaptive Learning to Partial-Set Domains
Still, the common requirement of an identical class space shared across domains hinders the application of domain adaptation to partial-set domains.
Adversarial Reweighting for Partial Domain Adaptation
To tackle the challenge of negative domain transfer, we propose a novel Adversarial Reweighting (AR) approach that adversarially learns weights for source-domain data to align the source and target distributions; a transferable deep recognition network is then learned on the reweighted source data.
Implicit Semantic Response Alignment for Partial Domain Adaptation
Partial Domain Adaptation (PDA) addresses the unsupervised domain adaptation problem where the target label space is a subset of the source label space.
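A common consequence of this setting is that source-only ("outlier") classes must be down-weighted so they do not cause negative transfer. As a minimal illustrative sketch (not any particular paper's method), one can average a source-trained classifier's softmax outputs over unlabeled target samples to estimate per-class weights, since target data are rarely predicted as outlier classes:

```python
import numpy as np

def estimate_class_weights(target_probs):
    """Average softmax outputs over unlabeled target samples;
    source-only classes receive low weight because target data
    are rarely predicted as those classes."""
    w = target_probs.mean(axis=0)          # shape: (num_source_classes,)
    return w / w.max()                     # normalize so the max weight is 1

# Toy example: 4 source classes, target covers only classes 0 and 1.
rng = np.random.default_rng(0)
logits = rng.normal(size=(100, 4))
logits[:, :2] += 4.0                       # target samples resemble classes 0/1
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
weights = estimate_class_weights(probs)
print(weights)                             # classes 2 and 3 get small weights
```

These weights can then scale the source classification loss or the domain-alignment term so that outlier source classes contribute little to adaptation.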
Source Class Selection with Label Propagation for Partial Domain Adaptation
Outlier classes can be detected when no target-domain data are labeled as belonging to them.
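The detection rule above can be sketched directly from target pseudo-labels: any source class to which no target sample is assigned is flagged as an outlier. This is an illustrative simplification, assuming pseudo-labels are already available (the paper obtains them via label propagation):

```python
import numpy as np

def detect_outlier_classes(pseudo_labels, num_source_classes):
    """Flag source classes to which no target sample was assigned."""
    counts = np.bincount(pseudo_labels, minlength=num_source_classes)
    return np.flatnonzero(counts == 0)     # indices of outlier classes

# Toy example: 5 source classes, target pseudo-labels cover only 0, 1, 3.
pseudo = np.array([0, 1, 1, 3, 0, 3, 1])
outliers = detect_outlier_classes(pseudo, num_source_classes=5)
print(outliers)                            # → [2 4]
```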
Partial Domain Adaptation without Domain Alignment
Considering the difficulty of achieving perfect alignment in PDA, we focus on model smoothness and discard the riskier domain alignment to enhance the adaptability of the model.
Improving Mini-batch Optimal Transport via Partial Transportation
Mini-batch optimal transport (m-OT) has been widely used recently to deal with the memory issue of OT in large-scale applications.
Domain Consensus Clustering for Universal Domain Adaptation
To better exploit the intrinsic structure of the target domain, we propose Domain Consensus Clustering (DCC), which exploits the domain consensus knowledge to discover discriminative clusters on both common samples and private ones.
Unsupervised Domain Adaptation with Progressive Adaptation of Subspaces
Unsupervised Domain Adaptation (UDA) aims to classify an unlabeled target domain by transferring knowledge from a labeled source domain under domain shift.