Uncertainty in Model-Agnostic Meta-Learning using Variational Inference

27 Jul 2019  ·  Cuong Nguyen, Thanh-Toan Do, Gustavo Carneiro ·

We introduce a new, rigorously-formulated Bayesian meta-learning algorithm that learns a prior distribution over model parameters for few-shot learning. The proposed algorithm employs gradient-based variational inference to infer the posterior over model parameters for a new task. Our algorithm can be applied to any model architecture and to various machine learning problems, including regression and classification. We show that models trained with our proposed meta-learning algorithm are well calibrated and accurate, with state-of-the-art calibration and classification results on two few-shot classification benchmarks (Omniglot and Mini-ImageNet), and competitive results in multi-modal task-distribution regression.
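The per-task adaptation described above (gradient-based variational inference of a posterior over model parameters, starting from a meta-learned prior) can be sketched on a toy problem. This is a hypothetical, minimal illustration, not the paper's implementation: it fits a diagonal-Gaussian posterior over the slope of a 1-D linear model by gradient descent on a Monte Carlo estimate of the negative ELBO, using the reparameterization trick. All function names (`elbo_grad_step`, `adapt_task`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def elbo_grad_step(mu, log_sigma, prior_mu, prior_sigma, x, y,
                   lr=0.05, n_samples=8):
    """One gradient step on a Monte Carlo estimate of the negative ELBO
    for a 1-D linear model y = w * x with unit-variance Gaussian noise.
    The variational posterior is q(w) = N(mu, exp(log_sigma)^2)."""
    grad_mu, grad_ls = 0.0, 0.0
    sigma = np.exp(log_sigma)
    for _ in range(n_samples):
        eps = rng.standard_normal()
        w = mu + sigma * eps                    # reparameterization trick
        resid = y - w * x
        dldw = -np.sum(resid * x)               # d(neg log-likelihood)/dw
        grad_mu += dldw / n_samples
        grad_ls += dldw * sigma * eps / n_samples  # chain rule through w
    # Closed-form gradients of KL(q || prior) for 1-D Gaussians
    grad_mu += (mu - prior_mu) / prior_sigma**2
    grad_ls += sigma**2 / prior_sigma**2 - 1.0
    return mu - lr * grad_mu, log_sigma - lr * grad_ls

def adapt_task(prior_mu, prior_sigma, x, y, steps=50):
    """Adapt the variational posterior to one task's support set,
    initialized at the (meta-learned) prior."""
    mu, log_sigma = prior_mu, np.log(prior_sigma)
    for _ in range(steps):
        mu, log_sigma = elbo_grad_step(mu, log_sigma,
                                       prior_mu, prior_sigma, x, y)
    return mu, np.exp(log_sigma)

# Few-shot support set for one task with true slope w* = 2
x = np.array([0.5, 1.0, 1.5, 2.0])
y = 2.0 * x
mu, sigma = adapt_task(prior_mu=0.0, prior_sigma=1.0, x=x, y=y)
print(mu, sigma)  # posterior mean near w*, shrunk toward the prior
```

In the full algorithm the prior itself is meta-learned across tasks (the outer loop), and retaining a posterior variance rather than a point estimate is what yields the calibrated predictive uncertainty reported in the abstract.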

Task: Few-Shot Image Classification    Model: VAMPIRE

Dataset                              Accuracy   Global Rank
Mini-ImageNet 5-way (1-shot)         51.54%     #89
Mini-ImageNet 5-way (5-shot)         64.31%     #85
Omniglot 20-way (1-shot)             93.20%     #16
Omniglot 5-way (1-shot)              98.43%     #11
Omniglot 20-way (5-shot)             98.52%     #12
Omniglot 5-way (5-shot)              99.56%     #11
Tiered ImageNet 5-way (1-shot)       69.87%     #31
Tiered ImageNet 5-way (5-shot)       82.70%     #36

Methods


No methods listed for this paper.