Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace

ICML 2018 · Yoonho Lee, Seungjin Choi

Gradient-based meta-learning methods leverage gradient descent to learn the commonalities among various tasks. While previous such methods have been successful in meta-learning tasks, they resort to simple gradient descent during meta-testing. Our primary contribution is the MT-net, which enables the meta-learner to learn on each layer's activation space a subspace that the task-specific learner performs gradient descent on. Additionally, a task-specific learner of an MT-net performs gradient descent with respect to a meta-learned distance metric, which warps the activation space to be more sensitive to task identity. We demonstrate that the dimension of this learned subspace reflects the complexity of the task-specific learner's adaptation task, and also that our model is less sensitive to the choice of initial learning rates than previous gradient-based meta-learning methods. Our method achieves state-of-the-art or comparable performance on few-shot classification and regression tasks.
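The sketch below illustrates the core mechanism described in the abstract for a single fully connected layer: a task-specific weight W updated in the inner loop, a meta-learned transformation T acting as a metric on the layer's activation space, and a meta-learned mask that restricts inner-loop updates to a subspace. This is not the authors' code (their implementation is in TensorFlow); the names MTLinear and adapted_weight, the layer sizes, the thresholded mask, and the toy single-task usage are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MTLinear(nn.Module):
    """One MT-net-style fully connected layer (illustrative sketch).

    The forward pass computes T(Wx): W is the task-specific weight and T is a
    meta-learned transformation that warps the activation space. A meta-learned
    binary mask over the rows of W selects which activation dimensions receive
    task-specific gradient updates, i.e. the learned subspace.
    """

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Parameter(torch.randn(out_dim, in_dim) * 0.01)  # updated per task (inner loop)
        self.T = nn.Parameter(torch.eye(out_dim))                   # meta-learned metric (outer loop only)
        self.mask_logits = nn.Parameter(torch.ones(out_dim))        # meta-learned subspace mask (outer loop only)

    def forward(self, x, W=None):
        W = self.W if W is None else W
        return F.linear(F.linear(x, W), self.T)  # computes T @ (W @ x) row-wise

    def adapted_weight(self, support_loss, inner_lr=0.01):
        """One inner-loop gradient step on W, restricted to masked rows."""
        grad = torch.autograd.grad(support_loss, self.W, create_graph=True)[0]
        # The paper samples a binary mask from learned logits during meta-training;
        # a simple threshold is enough here to illustrate the restricted subspace.
        M = (torch.sigmoid(self.mask_logits) > 0.5).float().unsqueeze(1)
        return self.W - inner_lr * M * grad


# Toy usage on a single task: adapt on the support set, predict on the query set.
layer = MTLinear(in_dim=4, out_dim=1)
x_support, y_support = torch.randn(5, 4), torch.randn(5, 1)
support_loss = F.mse_loss(layer(x_support), y_support)
W_task = layer.adapted_weight(support_loss)        # task-specific parameters
query_pred = layer(torch.randn(3, 4), W=W_task)    # adapted W, shared metric T
```

In the full method, only W is adapted per task; T and the mask logits are trained solely in the outer loop from query-set losses across tasks, which is what makes the metric and subspace shared task-general knowledge.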

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Few-Shot Image Classification | Mini-ImageNet 5-way (1-shot) | MT-net | Accuracy | 51.7% | #87 |
| Few-Shot Image Classification | Omniglot 1-shot, 20-way | MT-net | Accuracy | 96.2% | #9 |
| Few-Shot Image Classification | Omniglot 1-shot, 5-way | MT-net | Accuracy | 99.5% | #5 |
