Rapid Adaptation with Conditionally Shifted Neurons

We describe a mechanism by which artificial neural networks can learn rapid adaptation - the ability to adapt on the fly, with little data, to new tasks - that we call conditionally shifted neurons. We apply this mechanism in the framework of metalearning, where the aim is to replicate some of the flexibility of human learning in machines. Conditionally shifted neurons modify their activation values with task-specific shifts retrieved from a memory module, which is populated rapidly based on limited task experience. On metalearning benchmarks from the vision and language domains, models augmented with conditionally shifted neurons achieve state-of-the-art results.
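The core idea, a neuron whose activation receives an additive, task-specific shift retrieved by attention over a memory populated from a handful of support examples, can be sketched as follows. This is an illustrative simplification, not the authors' exact formulation: in the paper the stored shifts are derived from gradient information on the support-set loss, whereas here they are passed in directly, and the layer/attention names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class ConditionallyShiftedLayer:
    """Dense layer whose pre-activations receive an additive shift
    retrieved from a key-value memory (simplified sketch of the
    conditionally-shifted-neuron mechanism)."""

    def __init__(self, in_dim, out_dim, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W = rng.standard_normal((in_dim, out_dim)) * 0.1
        self.b = np.zeros(out_dim)
        self.keys = None    # one key per support example
        self.shifts = None  # one shift vector per key

    def write_memory(self, support_x, support_shifts):
        """Populate memory rapidly from limited task experience: each
        support input contributes a key and a conditional shift vector
        (in the paper, computed from support-loss gradients)."""
        self.keys = np.asarray(support_x, dtype=float)
        self.shifts = np.asarray(support_shifts, dtype=float)

    def forward(self, x):
        """If memory is populated, retrieve a shift by soft attention
        over the stored keys and add it before the nonlinearity."""
        pre = x @ self.W + self.b                 # (batch, out_dim)
        if self.keys is not None:
            attn = softmax(x @ self.keys.T)       # (batch, n_support)
            pre = pre + attn @ self.shifts        # task-specific shift
        return np.maximum(pre, 0.0)               # ReLU
```

At meta-test time, `write_memory` plays the role of the rapid adaptation step: a single pass over the support set conditions every subsequent forward pass on the new task, with no weight updates.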

ICML 2018
| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Few-Shot Image Classification | Mini-Imagenet 5-way (1-shot) | adaResNet (DF) | Accuracy | 56.88% | #68 |
| Few-Shot Image Classification | Mini-Imagenet 5-way (5-shot) | adaResNet (DF) | Accuracy | 71.94% | #64 |
| Few-Shot Image Classification | OMNIGLOT 20-way (1-shot) | adaCNN (DF) | Accuracy | 96.12% | #10 |
| Few-Shot Image Classification | OMNIGLOT 5-way (1-shot) | adaCNN (DF) | Accuracy | 98.42% | #12 |
| Few-Shot Image Classification | OMNIGLOT 20-way (5-shot) | adaCNN (DF) | Accuracy | 98.43% | #13 |
| Few-Shot Image Classification | OMNIGLOT 5-way (5-shot) | adaCNN (DF) | Accuracy | 99.37% | #15 |