Search Results for author: Luisa M Zintgraf

Found 2 papers, 1 paper with code

Prospect Pruning: Finding Trainable Weights at Initialization using Meta-Gradients

1 code implementation • ICLR 2022 • Milad Alizadeh, Shyam A. Tailor, Luisa M Zintgraf, Joost van Amersfoort, Sebastian Farquhar, Nicholas Donald Lane, Yarin Gal

Pruning neural networks at initialization would enable us to find sparse models that retain the accuracy of the original network while consuming fewer computational resources for training and inference.
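To make the idea of pruning at initialization concrete, here is a minimal sketch. Note that the paper scores weights with meta-gradients taken through several training steps (ProsPr); the sketch below instead uses the simpler single-shot, SNIP-style saliency |weight × gradient| computed at initialization, purely to illustrate how a sparsity mask can be chosen before training. The function name and arguments are hypothetical, not from the paper's code release.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def prune_at_init(model: nn.Module, inputs, targets, sparsity: float = 0.9):
    """One-shot pruning at initialization.

    Scores each weight by |w * dL/dw| on a single batch (SNIP-style, NOT the
    paper's meta-gradient score) and zeroes the lowest-scoring fraction.
    """
    # only prune weight matrices / conv kernels, not biases or norms
    params = [p for p in model.parameters() if p.dim() > 1]

    loss = F.cross_entropy(model(inputs), targets)
    grads = torch.autograd.grad(loss, params)

    scores = [(p * g).abs() for p, g in zip(params, grads)]
    all_scores = torch.cat([s.flatten() for s in scores])

    keep = max(1, int((1 - sparsity) * all_scores.numel()))  # weights to keep
    threshold = torch.topk(all_scores, keep).values.min()
    masks = [(s >= threshold).float() for s in scores]

    # apply the masks once; a full implementation would also re-apply them
    # after every optimizer step so pruned weights stay at zero during training
    with torch.no_grad():
        for p, m in zip(params, masks):
            p.mul_(m)
    return masks
```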

CAML: Fast Context Adaptation via Meta-Learning

no code implementations • 27 Sep 2018 • Luisa M Zintgraf, Kyriacos Shiarlis, Vitaly Kurin, Katja Hofmann, Shimon Whiteson

We propose CAML, a meta-learning method for fast adaptation that partitions the model parameters into two parts: context parameters that serve as additional input to the model and are adapted on individual tasks, and shared parameters that are meta-trained and shared across tasks.

Meta-Learning
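A minimal sketch of the parameter split described in the abstract, assuming a simple regression setting: task-specific context parameters are fed to the network as an extra input and adapted in the inner loop, while the shared network weights are left untouched there and would be meta-trained in an outer loop. Class and function names (ContextModel, inner_adapt) and the dimensions are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class ContextModel(nn.Module):
    """Shared network that receives a task-specific context vector as extra input."""
    def __init__(self, in_dim: int = 1, ctx_dim: int = 5, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim + ctx_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, context):
        # the context parameters are simply concatenated onto every input
        ctx = context.expand(x.shape[0], -1)
        return self.net(torch.cat([x, ctx], dim=-1))

def inner_adapt(model, x, y, ctx_dim: int = 5, steps: int = 2, lr: float = 0.1):
    """Adapt only the context parameters to one task; shared weights stay fixed."""
    context = torch.zeros(1, ctx_dim, requires_grad=True)
    for _ in range(steps):
        loss = ((model(x, context) - y) ** 2).mean()
        grad, = torch.autograd.grad(loss, context, create_graph=True)
        context = context - lr * grad  # differentiable update, so the outer loop can backprop through it
    return context
```

The outer (meta-training) loop, not shown here, would evaluate the adapted context on the task's query set and backpropagate that loss into the shared parameters of `model.net` with an ordinary optimizer.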
