Semi-Supervised Domain Generalization
8 papers with code • 0 benchmarks • 0 datasets
Most implemented papers
Semi-Supervised Domain Generalization with Stochastic StyleMatch
We find that DG methods, which by design cannot handle unlabeled data, perform poorly with limited labels in SSDG; SSL methods, especially FixMatch, obtain much better results but still fall well short of the vanilla baseline trained with full labels.
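The FixMatch baseline referenced above rests on a simple idea: keep a pseudo-label for an unlabeled sample only when the model's confidence on it exceeds a fixed threshold. A minimal sketch of that selection step, in NumPy (function name and threshold value are illustrative, not from the paper's code):

```python
import numpy as np

def select_pseudo_labels(probs, threshold=0.95):
    """FixMatch-style confidence filtering (sketch).

    probs: (N, num_classes) softmax outputs on unlabeled data.
    Returns the argmax pseudo-labels and a boolean mask marking
    which samples are confident enough to train on.
    """
    conf = probs.max(axis=1)        # highest class probability per sample
    labels = probs.argmax(axis=1)   # tentative pseudo-label per sample
    mask = conf >= threshold        # keep only high-confidence samples
    return labels, mask
```

In FixMatch the retained labels supervise predictions on a strongly augmented view of the same samples; under SSDG, domain shift between source domains makes this confidence filter less reliable, which is the failure mode several of the papers below target.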
MixStyle Neural Networks for Domain Generalization and Adaptation
MixStyle is easy to implement with a few lines of code, does not require modification to training objectives, and can fit a variety of learning paradigms including supervised domain generalization, semi-supervised domain generalization, and unsupervised domain adaptation.
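Since MixStyle is described as implementable in a few lines, a sketch helps make the idea concrete: the per-channel mean and standard deviation of intermediate features act as a style code, and mixing them between shuffled batch instances synthesizes novel styles. The NumPy version below is an illustration of the technique, not the authors' released code:

```python
import numpy as np

def mixstyle(x, alpha=0.1, seed=None):
    """MixStyle-like feature-statistics mixing (sketch).

    x: feature maps of shape (B, C, H, W).
    Mixes each instance's channel-wise (mean, std) with those of a
    randomly permuted instance, using Beta-distributed weights.
    """
    rng = np.random.default_rng(seed)
    b = x.shape[0]
    mu = x.mean(axis=(2, 3), keepdims=True)           # (B, C, 1, 1)
    sig = x.std(axis=(2, 3), keepdims=True) + 1e-6    # avoid divide-by-zero
    x_norm = (x - mu) / sig                           # strip instance style
    lam = rng.beta(alpha, alpha, size=(b, 1, 1, 1))   # mixing coefficients
    perm = rng.permutation(b)                         # partner instances
    mu_mix = lam * mu + (1 - lam) * mu[perm]          # mixed style stats
    sig_mix = lam * sig + (1 - lam) * sig[perm]
    return x_norm * sig_mix + mu_mix                  # re-style features
```

Because the operation only reshuffles feature statistics, it plugs into a network as a drop-in layer during training and is disabled at test time, which is why it needs no change to the training objective.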
Semi-Supervised Domain Generalization with Evolving Intermediate Domain
From this perspective, we introduce a novel paradigm of DG, termed Semi-Supervised Domain Generalization (SSDG), to explore how the labeled and unlabeled source domains can interact, and establish two settings: close-set and open-set SSDG.
Semi-Supervised Domain Generalization for Cardiac Magnetic Resonance Image Segmentation with High Quality Pseudo Labels
Our main goal is to improve the quality of pseudo labels for cardiac MRI segmentation across varied domains.
Semi-Supervised Domain Generalization for Object Detection via Language-Guided Feature Alignment
Existing domain adaptation (DA) and generalization (DG) methods in object detection enforce feature alignment in the visual space but face challenges like object appearance variability and scene complexity, which make it difficult to distinguish between objects and achieve accurate detection.
Towards Generic Semi-Supervised Framework for Volumetric Medical Image Segmentation
As a result, there is growing interest in using semi-supervised learning (SSL) techniques to train models with limited labeled data.
Improving Pseudo-labelling and Enhancing Robustness for Semi-Supervised Domain Generalization
A key challenge, faced by the best-performing SSL-based SSDG methods, is selecting accurate pseudo-labels under multiple domain shifts and reducing overfitting to source domains under limited labels.
Towards Generalizing to Unseen Domains with Few Labels
Existing domain generalization (DG) methods, which are unable to exploit unlabeled data, perform poorly compared to semi-supervised learning (SSL) methods under the SSDG setting.