MELR: Meta-Learning via Modeling Episode-Level Relationships for Few-Shot Learning

ICLR 2021 · Nanyi Fei, Zhiwu Lu, Tao Xiang, Songfang Huang

Most recent few-shot learning (FSL) approaches are based on episodic training, whereby each episode samples a few training instances (shots) per class to imitate the test condition. However, this strict adherence to the test condition has a negative side effect: the trained model is susceptible to poorly sampled shots. In this work, this problem is addressed for the first time by exploiting inter-episode relationships. Specifically, a novel meta-learning via modeling episode-level relationships (MELR) framework is proposed. By sampling two episodes that contain the same set of classes, MELR is designed to ensure that the meta-learned model is robust against poorly sampled shots in either episode or both. This is achieved through two key components: (1) a Cross-Episode Attention Module (CEAM) that alleviates the effects of poorly sampled shots, and (2) a Cross-Episode Consistency Regularization (CECR) that enforces consistency between the two classifiers learned from the two episodes, even in the presence of unrepresentative instances. Extensive experiments on two benchmarks with three different backbones show that MELR achieves a new state of the art.
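To make the two components concrete, here is a minimal NumPy sketch of the idea: support features from one episode attend over those of a paired episode with the same classes (a simplified stand-in for CEAM), and a symmetric KL term penalizes disagreement between the two episodes' prototype classifiers on the same queries (a stand-in for CECR). All function names, the prototype-based classifier, and the specific attention/loss forms are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_episode_attention(feats_a, feats_b, tau=1.0):
    """Refine episode-A support features by attending over episode-B
    support features (simplified stand-in for CEAM)."""
    attn = softmax(feats_a @ feats_b.T / tau, axis=-1)  # (Na, Nb)
    return feats_a + attn @ feats_b                     # residual refinement

def prototypes(feats, labels, n_classes):
    """Per-class mean of support features (prototype classifier weights)."""
    return np.stack([feats[labels == c].mean(0) for c in range(n_classes)])

def classify(queries, protos):
    """Negative squared Euclidean distance to prototypes -> probabilities."""
    d = ((queries[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return softmax(-d, axis=-1)

def consistency_loss(p_a, p_b, eps=1e-8):
    """Symmetric KL between the two episodes' predictions on the same
    queries (simplified stand-in for CECR)."""
    kl = lambda p, q: (p * np.log((p + eps) / (q + eps))).sum(-1).mean()
    return 0.5 * (kl(p_a, p_b) + kl(p_b, p_a))

# Illustrative usage: two 5-way 2-shot episodes over the same classes.
rng = np.random.default_rng(0)
labels = np.repeat(np.arange(5), 2)                 # support labels
feats_a = rng.normal(size=(10, 8))                  # episode-A support
feats_b = rng.normal(size=(10, 8))                  # episode-B support
queries = rng.normal(size=(4, 8))                   # shared queries

refined_a = cross_episode_attention(feats_a, feats_b)
refined_b = cross_episode_attention(feats_b, feats_a)
p_a = classify(queries, prototypes(refined_a, labels, 5))
p_b = classify(queries, prototypes(refined_b, labels, 5))
loss = consistency_loss(p_a, p_b)                   # regularization term
```

In training, this consistency term would be added to the usual episodic classification losses, encouraging the meta-learned model to produce the same decision regardless of which sample of shots it saw.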
