Semi-supervised Domain Adaptation

51 papers with code • 1 benchmark • 1 dataset

Semi-supervised domain adaptation (SSDA) adapts a model trained on a labeled source domain to a target domain in which only a small number of labeled examples are available alongside abundant unlabeled data.




Most implemented papers

AdaMatch: A Unified Approach to Semi-Supervised Learning and Domain Adaptation

google-research/adamatch ICLR 2022

We extend semi-supervised learning to the problem of domain adaptation to learn significantly higher-accuracy models that train on one data distribution and test on a different one.
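A core ingredient of AdaMatch is a *relative* confidence threshold: instead of a fixed cutoff, target pseudo-labels are kept only if their confidence exceeds a fraction of the model's average top-1 confidence on the labeled source batch. A minimal sketch of that idea (function name and toy arrays are illustrative assumptions, not the paper's code):

```python
import numpy as np

def relative_confidence_mask(source_probs, target_probs, tau=0.9):
    """Sketch of AdaMatch-style relative thresholding: keep a target
    pseudo-label only if its top-1 confidence exceeds tau times the
    mean top-1 confidence on the labeled source batch."""
    c_tau = tau * source_probs.max(axis=1).mean()
    target_conf = target_probs.max(axis=1)
    pseudo_labels = target_probs.argmax(axis=1)
    mask = target_conf >= c_tau
    return pseudo_labels, mask

source = np.array([[0.9, 0.1], [0.8, 0.2]])    # confident source predictions
target = np.array([[0.95, 0.05], [0.6, 0.4]])  # mixed target predictions
labels, mask = relative_confidence_mask(source, target, tau=0.9)
# threshold = 0.9 * 0.85 = 0.765, so only the first target sample passes
```

Because the cutoff tracks the source-batch confidence, it loosens early in training (when all predictions are uncertain) and tightens as the model becomes confident.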

Semi-supervised Domain Adaptation via Minimax Entropy

VisionLearningGroup/SSDA_MME ICCV 2019

Contemporary domain adaptation methods are very effective at aligning feature distributions of source and target domains without any target supervision.
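The minimax entropy (MME) method plays an adversarial game over the entropy of classifier outputs on unlabeled target data: the classifier is updated to maximize it while the feature extractor minimizes it (in practice via a gradient-reversal layer). A minimal sketch of the entropy quantity being contested, using a numpy stand-in for the network outputs:

```python
import numpy as np

def mean_entropy(probs, eps=1e-12):
    """Mean Shannon entropy of per-sample class distributions.
    In MME, this entropy over unlabeled target predictions is
    maximized w.r.t. the classifier and minimized w.r.t. the
    feature extractor."""
    return float(-(probs * np.log(probs + eps)).sum(axis=1).mean())

uniform = np.full((4, 2), 0.5)         # maximally uncertain predictions
peaked = np.array([[0.99, 0.01]] * 4)  # confident predictions
# mean_entropy(uniform) is ln(2) ~ 0.693, larger than mean_entropy(peaked)
```

Intuitively, maximizing entropy pushes the class prototypes toward the target data, while minimizing it pulls target features into discriminative clusters around those prototypes.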

Reducing Domain Gap by Reducing Style Bias

facebookresearch/DomainBed CVPR 2021

Convolutional Neural Networks (CNNs) often fail to maintain their performance when they confront new test domains, which is known as the problem of domain shift.

Cross-modal Learning for Domain Adaptation in 3D Semantic Segmentation

valeoai/xmuda 18 Jan 2021

Domain adaptation is an important task to enable learning when labels are scarce.

Multi-Source Domain Adaptation and Semi-Supervised Domain Adaptation with Focus on Visual Domain Adaptation Challenge 2019

Panda-Peter/visda2019-multisource 8 Oct 2019

Semi-Supervised Domain Adaptation: For this task, we adopt a standard self-learning framework, constructing a classifier from the labeled source and target data and using it to generate pseudo labels for the unlabeled target data.
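One round of such a self-learning loop can be sketched as: fit on the labeled data, predict on the unlabeled target data, and fold confident pseudo-labeled points back into the labeled set. The sketch below uses a nearest-centroid classifier with a softmax-over-distances confidence as a stand-in for the real model (the toy classifier and all names are illustrative assumptions):

```python
import numpy as np

def self_train_round(X_lab, y_lab, X_unlab, threshold=0.8):
    """One self-learning round with a toy nearest-centroid classifier."""
    classes = np.unique(y_lab)
    centroids = np.stack([X_lab[y_lab == c].mean(axis=0) for c in classes])
    # softmax over negative distances as a confidence proxy
    d = np.linalg.norm(X_unlab[:, None] - centroids[None], axis=2)
    probs = np.exp(-d) / np.exp(-d).sum(axis=1, keepdims=True)
    conf = probs.max(axis=1)
    pseudo = classes[probs.argmax(axis=1)]
    keep = conf >= threshold
    # augment the labeled set with confident pseudo-labeled target points
    return (np.vstack([X_lab, X_unlab[keep]]),
            np.concatenate([y_lab, pseudo[keep]]))

X_lab = np.array([[0.0], [10.0]])
y_lab = np.array([0, 1])
X_unlab = np.array([[0.5], [9.5], [5.0]])
X_new, y_new = self_train_round(X_lab, y_lab, X_unlab, threshold=0.8)
# the ambiguous midpoint (confidence 0.5) is left unlabeled
```

Repeating this round with the enlarged labeled set is the basic self-training loop; the confidence threshold guards against reinforcing early mistakes.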

Attract, Perturb, and Explore: Learning a Feature Alignment Network for Semi-supervised Domain Adaptation

tkkim93/ape ECCV 2020

Finally, the exploration scheme locally aligns features in a class-wise manner, selectively aligning unlabeled target features so as to complement both the attraction and perturbation schemes.

Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation

lijichang/CVPR2021-SSDA CVPR 2021

Pseudo labeling expands the number of "labeled" samples in each class in the target domain, and thus produces a more robust and powerful cluster core for each class to facilitate adversarial learning.

Probabilistic Contrastive Learning for Domain Adaptation

ljjcoder/PCL 11 Nov 2021

However, it is undesirably observed that the standard contrastive paradigm (features + $\ell_{2}$ normalization) brings only marginal benefit for domain adaptation.

Truly Generalizable Radiograph Segmentation with Conditional Domain Adaptation

hugo-oliveira/CoDAGANs 16 Jan 2019

We merge these unsupervised networks with supervised deep semantic segmentation architectures in order to create a semi-supervised method capable of learning from both unlabeled and labeled data, whenever labeling is available.