Universal Domain Adaptation
8 papers with code • 3 benchmarks • 3 datasets
Most implemented papers
OVANet: One-vs-All Network for Universal Domain Adaptation
In this paper, we propose a method to learn the known/unknown decision threshold using source samples and to adapt it to the target domain.
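The one-vs-all idea behind OVANet can be illustrated with a minimal conceptual sketch (not the authors' implementation; `ova_predict` and its inputs are hypothetical): each known class gets a binary "is it this class?" score, and a sample whose best score falls below the threshold is rejected as unknown.

```python
import numpy as np

def ova_predict(scores, threshold=0.5):
    """One-vs-all open-set decision (conceptual sketch).

    scores: (N, C) array, where scores[i, c] is the probability from the
    c-th binary "is it class c?" classifier for sample i.
    Returns class indices in [0, C), or -1 ("unknown") when even the
    best-matching class scores below the threshold.
    """
    best = scores.argmax(axis=1)                      # best-matching known class
    conf = scores[np.arange(len(scores)), best]       # its one-vs-all score
    return np.where(conf >= threshold, best, -1)      # reject low-confidence samples

# First sample is confidently class 0; second is rejected as unknown.
print(ova_predict(np.array([[0.9, 0.1], [0.3, 0.4]])))  # -> [ 0 -1]
```

OVANet's contribution is precisely that this threshold is not hand-tuned as above but learned from source samples and adapted to the target domain.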
Universal Domain Adaptation through Self Supervision
While some methods address target settings with either partial or open-set categories, they assume that the particular setting is known a priori.
On Universal Black-Box Domain Adaptation
The great promise that UB$^2$DA makes, however, brings significant learning challenges: domain adaptation can rely only on the predictions of unlabeled target data in a partially overlapping label space, obtained by accessing the interface of the source model alone.
Domain Consensus Clustering for Universal Domain Adaptation
To better exploit the intrinsic structure of the target domain, we propose Domain Consensus Clustering (DCC), which exploits the domain consensus knowledge to discover discriminative clusters on both common samples and private ones.
Distance-based Hyperspherical Classification for Multi-source Open-Set Domain Adaptation
Vision systems trained in closed-world scenarios fail when presented with new environmental conditions, new data distributions, and novel classes at deployment time.
VisDA-2021 Competition Universal Domain Adaptation to Improve Performance on Out-of-Distribution Data
Progress in machine learning is typically measured by training and testing a model on the same distribution of data, i.e., the same domain.
One Ring to Bring Them All: Towards Open-Set Recognition under Domain Shift
In experiments, we show: 1) after source training, the resulting source model achieves excellent performance on open-set single domain generalization and open-set recognition tasks; 2) after target adaptation, our method surpasses current UNDA approaches that require source data during adaptation on several benchmarks.