Bridging Multi-Task Learning and Meta-Learning: Towards Efficient Training and Effective Adaptation

16 Jun 2021 · Haoxiang Wang, Han Zhao, Bo Li

Multi-task learning (MTL) aims to improve the generalization of several related tasks by learning them jointly. By comparison, modern meta-learning also trains on multiple tasks jointly, but additionally assumes unseen tasks with limited labels at test time and aims for fast adaptation to them. Despite this subtle difference in problem formulation, both learning paradigms share the same insight: the structure shared across training tasks can lead to better generalization and adaptation. In this paper, we take an important step toward understanding the close connection between these two learning paradigms, through both theoretical analysis and empirical investigation. Theoretically, we first show that MTL shares the same optimization formulation as a class of gradient-based meta-learning (GBML) algorithms. We then prove that, for over-parameterized neural networks of sufficient depth, the predictive functions learned by MTL and GBML are close; in particular, the two models make similar predictions on the same unseen task. Empirically, we corroborate these theoretical findings by showing that, with a proper implementation, MTL is competitive with state-of-the-art GBML algorithms on a set of few-shot image classification benchmarks. Because existing GBML algorithms often involve costly second-order bi-level optimization, our first-order MTL method is an order of magnitude faster on large-scale datasets such as mini-ImageNet. We believe this work helps bridge the gap between the two learning paradigms and provides a computationally efficient alternative to GBML that still supports fast task adaptation.
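To make the contrast between the two optimization formulations concrete, here is a minimal sketch (not the authors' code; the toy regression task family, network sizes, task counts, and learning rates are all illustrative assumptions). MTL minimizes a single joint, first-order loss over a shared body and per-task heads, while MAML-style GBML minimizes a bi-level loss whose outer gradient flows through an inner adaptation step, which is where the costly second-order terms arise.

```python
import torch
import torch.nn as nn

loss_fn = nn.MSELoss()

def sample_task(k=5, q=15):
    # Illustrative toy task family: regress y = a*x + b with task-specific (a, b).
    a, b = torch.randn(2)
    xs, xq = torch.randn(k, 1), torch.randn(q, 1)
    return (xs, a * xs + b), (xq, a * xq + b)  # (support set, query set)

# --- MTL: one joint, first-order objective over a shared body and per-task
# heads, i.e. minimize sum_i L_i(theta, w_i) over (theta, w_1, ..., w_n). ---
tasks = [sample_task() for _ in range(8)]  # a fixed set of training tasks
body = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 32), nn.ReLU())
heads = nn.ModuleList([nn.Linear(32, 1) for _ in tasks])
opt = torch.optim.Adam([*body.parameters(), *heads.parameters()], lr=1e-3)
for step in range(200):
    loss = sum(loss_fn(head(body(xs)), ys)
               for ((xs, ys), _), head in zip(tasks, heads))
    opt.zero_grad()
    loss.backward()
    opt.step()

# --- GBML (MAML-style): a bi-level objective. The outer gradient must flow
# through an inner adaptation step, which is where the second-order terms
# come from. A single scalar weight keeps the structure easy to see. ---
W = torch.zeros(1, requires_grad=True)
meta_opt = torch.optim.SGD([W], lr=1e-2)
inner_lr = 0.1
for step in range(200):
    meta_loss = torch.zeros(())
    for _ in range(4):  # a meta-batch of sampled tasks
        (xs, ys), (xq, yq) = sample_task()
        inner_loss = loss_fn(xs * W, ys)  # inner (support) loss
        # create_graph=True keeps the graph of the inner update, so that
        # meta_loss.backward() differentiates through it (second-order).
        (g,) = torch.autograd.grad(inner_loss, W, create_graph=True)
        W_adapted = W - inner_lr * g  # one inner gradient step
        meta_loss = meta_loss + loss_fn(xq * W_adapted, yq)  # outer (query) loss
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()
```

The first-order structure of the MTL loop is what makes it cheaper per step: no second derivative is ever materialized, which is consistent with the order-of-magnitude speedup the abstract reports.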

Few-shot image classification results (Multi-Task Learning model; accuracy in %, with leaderboard global rank):

| Dataset         | Setting        | Accuracy | Global Rank |
|-----------------|----------------|----------|-------------|
| CIFAR-FS        | 5-way (1-shot) | 69.5     | #32         |
| CIFAR-FS        | 5-way (5-shot) | 84.1     | #32         |
| FC100           | 5-way (1-shot) | 42.4     | #17         |
| FC100           | 5-way (5-shot) | 57.7     | #18         |
| Mini-ImageNet   | 5-way (1-shot) | 59.84    | #69         |
| Mini-ImageNet   | 5-way (5-shot) | 77.72    | #57         |
| Tiered ImageNet | 5-way (1-shot) | 67.11    | #39         |
| Tiered ImageNet | 5-way (5-shot) | 83.69    | #33         |
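
For context on how such numbers are produced, few-shot accuracy is measured episodically. Below is a minimal sketch of one evaluation episode, assuming a pretrained encoder; the function name `episode_accuracy` and the adaptation rule (fitting a fresh linear head on the support set, a simple first-order procedure consistent with the paper's setting) are illustrative assumptions, not the authors' exact protocol.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def episode_accuracy(encoder, support_x, support_y, query_x, query_y,
                     n_way=5, steps=100, lr=1e-2):
    """One N-way K-shot episode: adapt a fresh linear head on the support
    set with the encoder frozen, then score the query set."""
    with torch.no_grad():
        zs, zq = encoder(support_x), encoder(query_x)
    head = nn.Linear(zs.shape[1], n_way)
    opt = torch.optim.SGD(head.parameters(), lr=lr)
    for _ in range(steps):
        loss = F.cross_entropy(head(zs), support_y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        pred = head(zq).argmax(dim=1)
    return (pred == query_y).float().mean().item()
```

A reported 5-way 1-shot score is this per-episode accuracy averaged over many sampled episodes, each with one labeled support image per class and a batch of query images.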
