Cross-Domain Few-Shot Learning

15 papers with code • 0 benchmarks • 0 datasets

At its core, this is a transfer learning problem: a model is trained on a source domain and then transferred to a target domain, under three conditions: (1) the classes in the target domain never appear in the source domain; (2) the data distribution of the target domain differs from that of the source domain; and (3) each class in the target domain has only a few labeled examples.
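As a concrete illustration of this protocol, here is a minimal sketch of a single N-way K-shot evaluation episode on a target domain: a frozen encoder stands in for a backbone pretrained on the source domain, and a simple linear head is fitted on the few labeled support examples. The encoder architecture, tensor shapes, and synthetic data are placeholder assumptions, not taken from any repository listed on this page.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Placeholder backbone standing in for an encoder pretrained on the source
# domain (e.g. miniImageNet); any frozen feature extractor could be used.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 84 * 84, 64)).eval()

def sample_episode(images, labels, n_way=5, k_shot=5, q_query=15):
    """Sample one N-way K-shot episode from the (unseen) target domain."""
    classes = torch.randperm(int(labels.max()) + 1)[:n_way]
    sx, sy, qx, qy = [], [], [], []
    for new_label, c in enumerate(classes):
        idx = (labels == c).nonzero(as_tuple=True)[0]
        idx = idx[torch.randperm(len(idx))][: k_shot + q_query]
        sx.append(images[idx[:k_shot]])
        qx.append(images[idx[k_shot:]])
        sy += [new_label] * k_shot
        qy += [new_label] * q_query
    return torch.cat(sx), torch.tensor(sy), torch.cat(qx), torch.tensor(qy)

def evaluate_episode(images, labels, n_way=5):
    sx, sy, qx, qy = sample_episode(images, labels, n_way=n_way)
    with torch.no_grad():                        # backbone stays frozen
        zs, zq = encoder(sx), encoder(qx)
    head = nn.Linear(zs.size(1), n_way)          # simple classifier on few labels
    opt = torch.optim.SGD(head.parameters(), lr=0.1)
    for _ in range(100):                         # fit on the support set only
        opt.zero_grad()
        F.cross_entropy(head(zs), sy).backward()
        opt.step()
    return (head(zq).argmax(1) == qy).float().mean().item()

# Synthetic stand-in for a target-domain dataset, just to keep the sketch runnable.
images, labels = torch.randn(300, 3, 84, 84), torch.arange(10).repeat_interleave(30)
print(f"episode accuracy: {evaluate_episode(images, labels):.3f}")
```

In the cross-domain setting, these episodes are drawn from a dataset whose classes and image statistics the encoder never saw during source-domain training.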

Most implemented papers

Self-Supervised Learning For Few-Shot Image Classification

Alibaba-AAIG/SSL-FEW-SHOT 14 Nov 2019

In this paper, we proposed to train a more generalized embedding network with self-supervised learning (SSL) which can provide robust representation for downstream tasks by learning from the data itself.
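As a loose illustration of training an embedding from the data itself (not the exact SSL objective used in SSL-FEW-SHOT; consult the repository for that), the sketch below adds a rotation-prediction auxiliary head next to a supervised classification head. The backbone size, base-class count, and loss weight are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU())
cls_head = nn.Linear(128, 64)   # supervised head over the base classes (64 assumed)
rot_head = nn.Linear(128, 4)    # self-supervised head: predict 0/90/180/270 rotation

def rotate_batch(x):
    """Create rotated copies of a batch and the matching rotation labels."""
    rots = [torch.rot90(x, k, dims=(2, 3)) for k in range(4)]
    labels = torch.arange(4).repeat_interleave(x.size(0))
    return torch.cat(rots), labels

x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 64, (8,))

x_rot, y_rot = rotate_batch(x)
loss = F.cross_entropy(cls_head(backbone(x)), y) \
     + 0.5 * F.cross_entropy(rot_head(backbone(x_rot)), y_rot)  # SSL auxiliary term
loss.backward()
```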

A Broader Study of Cross-Domain Few-Shot Learning

IBM/cdfsl-benchmark ECCV 2020

Extensive experiments on the proposed benchmark are performed to evaluate state-of-art meta-learning approaches, transfer learning approaches, and newer methods for cross-domain few-shot learning.

Cross-Domain Few-Shot Learning by Representation Fusion

ml-jku/chef 13 Oct 2020

On the few-shot datasets miniImagenet and tieredImagenet with small domain shifts, CHEF is competitive with state-of-the-art methods.

Shallow Bayesian Meta Learning for Real-World Few-Shot Recognition

open-debin/bayesian_mqda ICCV 2021

Current state-of-the-art few-shot learners focus on developing effective training procedures for feature representations, before using simple classifiers, e.g. nearest centroid.
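To illustrate how simple that final classification step typically is, here is a minimal nearest-centroid classifier over frozen features; the feature dimension, episode sizes, and the choice of Euclidean distance are illustrative assumptions.

```python
import torch

def nearest_centroid_predict(support_feats, support_labels, query_feats):
    """Classify query features by distance to per-class centroids of support features."""
    classes = support_labels.unique()
    centroids = torch.stack([support_feats[support_labels == c].mean(0) for c in classes])
    dists = torch.cdist(query_feats, centroids)   # Euclidean; cosine is also common
    return classes[dists.argmin(dim=1)]

# Toy frozen features: 5 classes, 5 support and 15 query examples each, 64-dim.
support_feats = torch.randn(25, 64)
support_labels = torch.arange(5).repeat_interleave(5)
query_feats = torch.randn(75, 64)
print(nearest_centroid_predict(support_feats, support_labels, query_feats)[:10])
```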

Cross-domain Few-shot Learning with Task-specific Adapters

google-research/meta-dataset CVPR 2022

In this paper, we look at the problem of cross-domain few-shot classification that aims to learn a classifier from previously unseen classes and domains with few labeled samples.
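One hedged reading of the adapter idea is sketched below: a lightweight residual adapter is attached to a frozen pretrained block and would be fitted per task on the support set. The 1x1-convolution parameterization and layer sizes are illustrative assumptions rather than a reproduction of the paper's architecture.

```python
import torch
import torch.nn as nn

class ResidualAdapter(nn.Module):
    """A 1x1-conv residual adapter inserted next to a frozen conv block."""
    def __init__(self, channels):
        super().__init__()
        self.adapter = nn.Conv2d(channels, channels, kernel_size=1)
        nn.init.zeros_(self.adapter.weight)   # start as an identity mapping
        nn.init.zeros_(self.adapter.bias)

    def forward(self, frozen_block, x):
        return frozen_block(x) + self.adapter(x)

# Frozen pretrained block (placeholder); only the adapter would be trained per task,
# e.g. by minimizing a support-set loss over adapter.parameters() in each episode.
frozen_block = nn.Conv2d(32, 32, kernel_size=3, padding=1)
for p in frozen_block.parameters():
    p.requires_grad = False

adapter = ResidualAdapter(32)
support = torch.randn(10, 32, 16, 16)          # task-specific support features
print(adapter(frozen_block, support).shape)    # adapted features for this task
```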

Universal Representations: A Unified Look at Multiple Task and Domain Learning

vico-uoe/universalrepresentations 6 Apr 2022

We propose a unified look at jointly learning multiple vision tasks and visual domains through universal representations, a single deep neural network.

A Transductive Multi-Head Model for Cross-Domain Few-Shot Learning

leezhp1994/TMHFS 8 Jun 2020

The TMHFS method extends the Meta-Confidence Transduction (MCT) and Dense Feature-Matching Networks (DFMN) method [2] by introducing a new prediction head, i.e., an instance-wise global classification network based on semantic information, after the common feature embedding network.
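The sketch below shows the general shape of such a multi-head design, assuming a prototype-style episodic head in place of the MCT/DFMN components (which are not reproduced here): an instance-wise global classification head over the base classes shares the embedding network with the episodic head, and both losses are combined.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

embedding = nn.Sequential(nn.Flatten(), nn.Linear(3 * 84 * 84, 128), nn.ReLU())
global_head = nn.Linear(128, 64)   # instance-wise classifier over the base classes (64 assumed)

def prototype_logits(support_z, support_y, query_z):
    """Stand-in episodic head: negative distances to per-class prototypes."""
    protos = torch.stack([support_z[support_y == c].mean(0) for c in support_y.unique()])
    return -torch.cdist(query_z, protos)

# One training episode: the episodic loss and the global classification loss
# share the same embedding network.
sx, sy = torch.randn(25, 3, 84, 84), torch.arange(5).repeat_interleave(5)
qx, qy = torch.randn(75, 3, 84, 84), torch.arange(5).repeat_interleave(15)
base_y = torch.randint(0, 64, (25,))     # global (base-class) labels of the support images

zs, zq = embedding(sx), embedding(qx)
episodic_loss = F.cross_entropy(prototype_logits(zs, sy, zq), qy)
global_loss = F.cross_entropy(global_head(zs), base_y)
(episodic_loss + global_loss).backward()
```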

Modular Adaptation for Cross-Domain Few-Shot Learning

frkl/modular-adaptation 1 Apr 2021

Adapting pre-trained representations has become the go-to recipe for learning new downstream tasks with limited examples.

DAMSL: Domain Agnostic Meta Score-based Learning

johncai117/DAMSL 6 Jun 2021

In this paper, we propose Domain Agnostic Meta Score-based Learning (DAMSL), a novel, versatile and highly effective solution that delivers significant out-performance over state-of-the-art methods for cross-domain few-shot learning.

Dynamic Distillation Network for Cross-Domain Few-Shot Recognition with Unlabeled Data

asrafulashiq/dynamic-cdfsl NeurIPS 2021

As the base dataset and unlabeled dataset are from different domains, projecting the target images in the class-domain of the base dataset with a fixed pretrained model might be sub-optimal.