1 code implementation • 3 Aug 2022 • Simon Guiroy, Christopher Pal, Gonçalo Mordido, Sarath Chandar
Specifically, we analyze the evolution, during meta-training, of the neural activations at each hidden layer, on a small set of unlabelled support examples from a single task of the target task distribution, as this constitutes minimal and justifiably accessible information about the target problem.
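As an illustration of this kind of activation probing, here is a minimal sketch, assuming a PyTorch model and a small batch of unlabelled support examples from one target task; `model`, `support_x`, and the choice of layers to hook are hypothetical, not the paper's released code.

```python
# Minimal sketch: record each hidden layer's activations on a small
# batch of unlabelled support examples via PyTorch forward hooks.
import torch
import torch.nn as nn

def capture_hidden_activations(model: nn.Module, support_x: torch.Tensor):
    """Return {layer_name: activation tensor} for one forward pass."""
    activations = {}
    hooks = []

    def make_hook(name):
        def hook(module, inputs, output):
            activations[name] = output.detach()
        return hook

    # Hypothetical layer filter: track Linear and Conv2d modules.
    for name, module in model.named_modules():
        if isinstance(module, (nn.Linear, nn.Conv2d)):
            hooks.append(module.register_forward_hook(make_hook(name)))

    with torch.no_grad():
        model(support_x)

    for h in hooks:
        h.remove()
    return activations
```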
no code implementations • 13 Oct 2021 • Gabriele Prato, Simon Guiroy, Ethan Caballero, Irina Rish, Sarath Chandar
The empirical science of neural scaling laws is a rapidly growing area of significant importance to the future of machine learning, particularly in light of recent breakthroughs achieved by large-scale pre-trained models such as GPT-3, CLIP, and DALL-E.
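As background, an empirical scaling law is typically a power law fit between a resource (parameters, data, compute) and loss. The snippet below is a hedged sketch of such a fit; the functional form `a * N**(-b) + c` and the toy measurements are illustrative assumptions, not results from this paper.

```python
# Sketch: fit a power-law scaling curve L(N) = a * N**(-b) + c
# to hypothetical (model size, loss) measurements.
import numpy as np
from scipy.optimize import curve_fit

def power_law(n, a, b, c):
    return a * n ** (-b) + c

# Illustrative data only, not from the paper.
n_params = np.array([1e6, 1e7, 1e8, 1e9])
losses = np.array([3.2, 2.6, 2.1, 1.8])

(a, b, c), _ = curve_fit(power_law, n_params, losses, p0=(10.0, 0.1, 1.0))
print(f"fitted scaling exponent b = {b:.3f}")
```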
no code implementations • 29 Sep 2021 • Simon Guiroy, Christopher Pal, Sarath Chandar
To this end, we empirically show that as meta-training progresses, a model's generalization to a target distribution of novel tasks can be estimated by analyzing the dynamics of its neural activations.
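A rough sketch of what analyzing activation dynamics across meta-training could look like, reusing the hypothetical `capture_hidden_activations` helper sketched earlier; the similarity measure (per-layer cosine similarity between consecutive checkpoints) is an illustrative assumption, not the paper's exact procedure.

```python
# Sketch: track how each layer's activations on fixed support examples
# change between consecutive meta-training checkpoints.
import torch
import torch.nn.functional as F

def activation_dynamics(checkpoints, support_x):
    """checkpoints: list of model snapshots saved during meta-training.
    Returns {layer_name: [similarity between checkpoints t and t+1, ...]}."""
    history = [capture_hidden_activations(m, support_x) for m in checkpoints]
    dynamics = {}
    for name in history[0]:
        sims = []
        for prev, curr in zip(history, history[1:]):
            a = prev[name].flatten(1)  # (batch, features)
            b = curr[name].flatten(1)
            sims.append(F.cosine_similarity(a, b, dim=1).mean().item())
        dynamics[name] = sims
    return dynamics
```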
no code implementations • ICML Workshop LifelongML 2020 • Simon Guiroy, Vikas Verma, Christopher Pal
The study of the generalization of neural networks in gradient-based meta-learning has recently attracted great research interest.
no code implementations • 16 Jul 2019 • Simon Guiroy, Vikas Verma, Christopher Pal
We also show that the coherence of meta-test gradients, measured by the average inner product between the task-specific gradient vectors evaluated at the meta-train solution, is correlated with generalization.
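The coherence measure described above translates directly to code: compute one gradient vector per meta-test task at the meta-trained solution, then average the pairwise inner products. The sketch below assumes a classification loss and hypothetical names (`model`, `task_batches`); it is not the authors' released implementation.

```python
# Sketch: average inner product between task-specific gradient vectors
# evaluated at the meta-trained solution.
import torch
import torch.nn.functional as F

def gradient_coherence(model, task_batches):
    """task_batches: iterable yielding one (x, y) batch per meta-test task."""
    grads = []
    for x, y in task_batches:
        model.zero_grad()
        loss = F.cross_entropy(model(x), y)  # assumed classification loss
        loss.backward()
        g = torch.cat([p.grad.flatten() for p in model.parameters()
                       if p.grad is not None])
        grads.append(g)

    # Average the inner product over all distinct task pairs.
    total, pairs = 0.0, 0
    for i in range(len(grads)):
        for j in range(i + 1, len(grads)):
            total += torch.dot(grads[i], grads[j]).item()
            pairs += 1
    return total / pairs
```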