cross-domain few-shot learning

29 papers with code • 1 benchmark • 1 dataset

At its core, cross-domain few-shot learning is a form of transfer learning: a model is trained on a source domain and then transferred to a target domain, under three conditions: (1) the classes in the target domain never appear in the source domain; (2) the data distribution of the target domain differs from that of the source domain; (3) each class in the target domain has very few labeled examples.
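Evaluation in this setting is typically episodic: each test episode samples N novel target-domain classes with K labeled support examples per class, plus query examples to classify. A minimal sketch of episode sampling (the `dataset` structure and function names here are illustrative, not from any specific benchmark):

```python
import random

def sample_episode(dataset, n_way=5, k_shot=1, q_queries=15):
    """Sample an N-way K-shot episode from a target-domain dataset.

    `dataset` is assumed to map each class label to a list of examples
    (a hypothetical structure for illustration).
    """
    classes = random.sample(sorted(dataset), n_way)
    support, query = [], []
    for label in classes:
        examples = random.sample(dataset[label], k_shot + q_queries)
        support += [(x, label) for x in examples[:k_shot]]
        query += [(x, label) for x in examples[k_shot:]]
    return support, query

# Toy usage: 10 target-domain classes with 30 examples each.
toy = {c: [f"img_{c}_{i}" for i in range(30)] for c in range(10)}
support, query = sample_episode(toy, n_way=5, k_shot=1, q_queries=15)
# support has 5*1 = 5 labeled examples; query has 5*15 = 75.
```

Accuracy is then averaged over many such episodes, which is what the benchmarks below report.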

Most implemented papers

Cross-domain Few-shot Learning with Task-specific Adapters

google-research/meta-dataset CVPR 2022

In this paper, we look at the problem of cross-domain few-shot classification that aims to learn a classifier from previously unseen classes and domains with few labeled samples.

Self-Supervision Can Be a Good Few-Shot Learner

bbbdylan/unisiam 19 Jul 2022

Specifically, we maximize the mutual information (MI) of instances and their representations with a low-bias MI estimator to perform self-supervised pre-training.
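Maximizing the MI between an instance and its representation is commonly done with a contrastive lower bound such as InfoNCE, where two augmented views of the same instance form a positive pair and other instances serve as negatives. A generic numpy sketch of that estimator (not UniSiam's exact low-bias variant):

```python
import numpy as np

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE lower bound on the MI between two views of n instances.

    z1, z2: (n, d) L2-normalized embeddings of the same n instances under
    two augmentations. Matching rows are positives; all other rows in the
    batch act as negatives.
    """
    logits = z1 @ z2.T / temperature             # (n, n) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # cross-entropy on positives

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
z /= np.linalg.norm(z, axis=1, keepdims=True)
loss_aligned = info_nce(z, z)  # identical views give a near-zero loss
```

Minimizing this loss over augmented pairs is the self-supervised pre-training step; the resulting encoder is then evaluated on few-shot episodes.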

Self-Supervised Learning For Few-Shot Image Classification

Alibaba-AAIG/SSL-FEW-SHOT 14 Nov 2019

In this paper, we proposed to train a more generalized embedding network with self-supervised learning (SSL) which can provide robust representation for downstream tasks by learning from the data itself.

A Broader Study of Cross-Domain Few-Shot Learning

IBM/cdfsl-benchmark ECCV 2020

Extensive experiments on the proposed benchmark are performed to evaluate state-of-art meta-learning approaches, transfer learning approaches, and newer methods for cross-domain few-shot learning.

Cross-Domain Few-Shot Learning by Representation Fusion

ml-jku/chef 13 Oct 2020

On the few-shot datasets miniImagenet and tieredImagenet with small domain shifts, CHEF is competitive with state-of-the-art methods.

Shallow Bayesian Meta Learning for Real-World Few-Shot Recognition

open-debin/bayesian_mqda ICCV 2021

Current state-of-the-art few-shot learners focus on developing effective training procedures for feature representations, before using simple (e.g., nearest-centroid) classifiers.
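The nearest-centroid classifier referred to here averages the support embeddings of each class and assigns a query to the closest class mean. A minimal sketch of that baseline on top of frozen features (function and variable names are illustrative):

```python
import numpy as np

def nearest_centroid_predict(support_feats, support_labels, query_feats):
    """Nearest-centroid classification on frozen feature embeddings.

    support_feats: (n, d) embeddings of the labeled support set.
    support_labels: (n,) integer class labels.
    query_feats: (m, d) embeddings to classify.
    """
    classes = np.unique(support_labels)
    # One centroid per class: the mean of that class's support embeddings.
    centroids = np.stack([support_feats[support_labels == c].mean(axis=0)
                          for c in classes])
    # Euclidean distance from every query to every centroid, then argmin.
    dists = np.linalg.norm(query_feats[:, None] - centroids[None], axis=2)
    return classes[dists.argmin(axis=1)]

feats = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
labels = np.array([0, 0, 1, 1])
queries = np.array([[0.05, 0.02], [5.0, 5.1]])
preds = nearest_centroid_predict(feats, labels, queries)
```

Because the classifier has no learned parameters, all adaptation capacity lives in the feature extractor, which is exactly the design choice such papers examine.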

Understanding Cross-Domain Few-Shot Learning Based on Domain Similarity and Few-Shot Difficulty

sungnyun/understanding-cdfsl 1 Feb 2022

This data enables self-supervised pre-training on the target domain, in addition to supervised pre-training on the source domain.

Universal Representations: A Unified Look at Multiple Task and Domain Learning

vico-uoe/universalrepresentations 6 Apr 2022

We propose a unified look at jointly learning multiple vision tasks and visual domains through universal representations, a single deep neural network.

StyleAdv: Meta Style Adversarial Training for Cross-Domain Few-Shot Learning

lovelyqian/styleadv-cdfsl CVPR 2023

Thus, inspired by vanilla adversarial learning, a novel model-agnostic meta Style Adversarial training (StyleAdv) method together with a novel style adversarial attack method is proposed for CD-FSL.

A Transductive Multi-Head Model for Cross-Domain Few-Shot Learning

leezhp1994/TMHFS 8 Jun 2020

The TMHFS method extends the Meta-Confidence Transduction (MCT) and Dense Feature-Matching Networks (DFMN) method [2] by introducing a new prediction head, i.e., an instance-wise global classification network based on semantic information, after the common feature embedding network.