Universal Domain Adaptation

8 papers with code • 3 benchmarks • 3 datasets

Universal domain adaptation (UniDA) transfers knowledge from a labeled source domain to an unlabeled target domain without assuming prior knowledge of how the two label sets relate: the target domain may contain shared classes, source-private classes, or target-private ("unknown") classes. A model must therefore classify target samples from shared classes while rejecting samples from unknown ones.

Most implemented papers

OVANet: One-vs-All Network for Universal Domain Adaptation

VisionLearningGroup/OVANet ICCV 2021

In this paper, we propose a method to learn the threshold using source samples and to adapt it to the target domain.
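The one-vs-all idea can be sketched as follows. This is an illustrative NumPy sketch, not the official OVANet code: the function name and the fixed threshold are assumptions (OVANet learns the decision boundary from source samples via its one-vs-all classifiers rather than hand-setting it), and the per-class "known" probabilities are assumed to come from sigmoid-activated OVA heads.

```python
import numpy as np

def ova_predict(closed_logits, ova_scores, threshold=0.5):
    """Open-set prediction with a closed-set head plus one-vs-all scores.

    closed_logits: (N, K) logits from a closed-set K-way classifier.
    ova_scores:    (N, K) per-class "known" probabilities in [0, 1],
                   e.g. sigmoid outputs of K one-vs-all classifiers.
    Returns an (N,) array of labels in {0..K-1}, or -1 for "unknown".
    """
    pred = closed_logits.argmax(axis=1)                  # candidate class per sample
    known_prob = ova_scores[np.arange(len(pred)), pred]  # OVA score of that class
    return np.where(known_prob >= threshold, pred, -1)   # reject low-score samples
```

For example, with `closed_logits = [[2.0, 0.1], [0.2, 3.0]]` and `ova_scores = [[0.9, 0.1], [0.3, 0.2]]`, the first sample is accepted as class 0 while the second is rejected as unknown, since its one-vs-all score (0.2) falls below the threshold.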

Universal Domain Adaptation through Self Supervision

VisionLearningGroup/DANCE NeurIPS 2020

While some methods address target settings with either partial or open-set categories, they assume that the particular setting is known a priori.

On Universal Black-Box Domain Adaptation

Gorilla-Lab-SCUT/UB2DA 10 Apr 2021

The great promise that UB²DA makes, however, brings significant learning challenges, since domain adaptation can rely only on the predictions of unlabeled target data in a partially overlapped label space, obtained by accessing the interface of the source model.

Domain Consensus Clustering for Universal Domain Adaptation

Solacex/Domain-Consensus-Clustering CVPR 2021

To better exploit the intrinsic structure of the target domain, we propose Domain Consensus Clustering (DCC), which leverages domain consensus knowledge to discover discriminative clusters among both common and private samples.

Distance-based Hyperspherical Classification for Multi-source Open-Set Domain Adaptation

silvia1993/HyMOS 5 Jul 2021

Vision systems trained in closed-world scenarios fail when presented with new environmental conditions, new data distributions, and novel classes at deployment time.

VisDA-2021 Competition Universal Domain Adaptation to Improve Performance on Out-of-Distribution Data

VisionLearningGroup/visda21-dev 23 Jul 2021

Progress in machine learning is typically measured by training and testing a model on the same distribution of data, i.e., the same domain.

One Ring to Bring Them All: Towards Open-Set Recognition under Domain Shift

albert0147/onering 7 Jun 2022

In experiments, we show: 1) after source training, the resulting source model achieves excellent performance on open-set single domain generalization and open-set recognition tasks; 2) after target adaptation, our method surpasses current UNDA approaches that require source data during adaptation on several benchmarks.