no code implementations • 21 Jan 2024 • Kiyoon Kim, Shreyank N Gowda, Panagiotis Eustratiadis, Antreas Antoniou, Robert B Fisher
More precisely, we created dataset splits of HMDB-51 or UCF-101 for training and Kinetics-400 for testing, using the subset of classes that overlap between the train and test datasets.
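The overlapping-class split described above can be sketched as a simple set intersection over the two datasets' label lists. The class names below are illustrative, not the actual HMDB-51/Kinetics-400 labels:

```python
# Hedged sketch: building a cross-dataset split from overlapping classes.
# The class names are placeholders, not the real label sets.

def overlapping_split(train_classes, test_classes):
    """Return the sorted list of class names present in both datasets."""
    return sorted(set(train_classes) & set(test_classes))

hmdb_classes = ["brush_hair", "climb", "golf", "ride_bike", "smile"]
kinetics_classes = ["golf", "ride_bike", "smile", "yoga"]

shared = overlapping_split(hmdb_classes, kinetics_classes)
print(shared)  # only these classes are usable for both training and testing
```

Training examples would then be filtered to `shared` on the HMDB-51/UCF-101 side, and evaluation restricted to the same classes on the Kinetics-400 side.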
no code implementations • 27 Oct 2023 • Fady Rezk, Antreas Antoniou, Henry Gouk, Timothy Hospedales
We analyze VeLO (versatile learned optimizer), the largest scale attempt to train a general purpose "foundational" optimizer to date.
no code implementations • 29 Sep 2023 • Alessandro Fontanella, Wenwen Li, Grant Mair, Antreas Antoniou, Eleanor Platt, Paul Armitage, Emanuele Trucco, Joanna Wardlaw, Amos Storkey
Deep learning (DL) methods can be designed for acute ischaemic stroke (AIS) lesion detection on CT using the vast quantities of routinely collected CT brain scan data.
no code implementations • 26 Sep 2023 • Alessandro Fontanella, Wenwen Li, Grant Mair, Antreas Antoniou, Eleanor Platt, Chloe Martin, Paul Armitage, Emanuele Trucco, Joanna Wardlaw, Amos Storkey
Despite the large amount of brain CT data generated in clinical practice, the availability of CT datasets for deep learning (DL) research is currently limited.
1 code implementation • 27 Mar 2023 • Alessandro Fontanella, Antreas Antoniou, Wenwen Li, Joanna Wardlaw, Grant Mair, Emanuele Trucco, Amos Storkey
We investigate the best way to generate the saliency maps employed in our architecture and propose a way to obtain them from adversarially generated counterfactual images.
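One simple way to obtain a saliency map from a counterfactual, sketched below, is to take the normalised per-pixel difference between the original image and its counterfactual; the regions the counterfactual changed are the salient ones. This is an illustration of the idea, not the paper's exact procedure:

```python
import numpy as np

# Hedged sketch (assumed method, not the authors' implementation):
# derive a saliency map as the normalised absolute difference between
# an image and its adversarially generated counterfactual.
def saliency_from_counterfactual(image, counterfactual):
    diff = np.abs(counterfactual - image)
    return diff / (diff.max() + 1e-8)  # normalise to roughly [0, 1]

img = np.zeros((4, 4))
cf = img.copy()
cf[1, 2] = 1.0  # the single pixel the counterfactual altered
sal = saliency_from_counterfactual(img, cf)
print(sal)  # highest saliency where the counterfactual differs most
```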
1 code implementation • 30 Jan 2023 • Adam Jelley, Amos Storkey, Antreas Antoniou, Sam Devlin
We evaluate our approach on an adaptation of a comprehensive few-shot learning benchmark, Meta-Dataset, and demonstrate the benefits of POEM over other meta-learning methods at representation learning from partial observations.
2 code implementations • 15 Apr 2020 • Antreas Antoniou, Massimiliano Patacchiola, Mateusz Ochal, Amos Storkey
Both few-shot and continual learning have seen substantial progress in recent years, thanks to the introduction of proper benchmarks.
1 code implementation • 11 Apr 2020 • Timothy Hospedales, Antreas Antoniou, Paul Micaelli, Amos Storkey
We survey promising applications and successes of meta-learning such as few-shot learning and reinforcement learning.
1 code implementation • NeurIPS 2019 • Antreas Antoniou, Amos J. Storkey
In this paper, we propose a framework called Self-Critique and Adapt, or SCA.
1 code implementation • 24 May 2019 • Antreas Antoniou, Amos Storkey
In this paper, we propose a framework called Self-Critique and Adapt or SCA, which learns to learn a label-free loss function, parameterized as a neural network.
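The core idea of a label-free loss "parameterized as a neural network" can be sketched as a small critic network that maps a model's unlabelled predictions to a scalar loss value. The numpy toy below is a minimal illustration under that reading, not the SCA architecture:

```python
import numpy as np

# Hedged sketch: a tiny "critic" MLP that produces a scalar loss from
# softmax predictions alone -- no ground-truth labels are consumed.
# Shapes and sizes are arbitrary choices for illustration.
rng = np.random.default_rng(0)

class LabelFreeCritic:
    def __init__(self, in_dim, hidden=8):
        self.w1 = rng.standard_normal((in_dim, hidden)) * 0.1
        self.w2 = rng.standard_normal((hidden, 1)) * 0.1

    def loss(self, predictions):
        # predictions: (batch, classes) softmax outputs
        h = np.tanh(predictions @ self.w1)
        return float(np.mean(h @ self.w2))

preds = rng.random((4, 5))
preds /= preds.sum(axis=1, keepdims=True)  # make each row a distribution
critic = LabelFreeCritic(in_dim=5)
print(critic.loss(preds))  # scalar loss computed without any labels
```

In the meta-learning setting, the critic's parameters would themselves be learned in the outer loop so that minimising this loss in the inner loop improves query-set performance.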
Ranked #23 on Few-Shot Image Classification on CUB 200 5-way 5-shot
no code implementations • 26 Feb 2019 • Antreas Antoniou, Amos Storkey
The field of few-shot learning has been extensively explored in the supervised setting, where per-class labels are available.
Data Augmentation • Unsupervised Few-Shot Image Classification +1
no code implementations • 1 Nov 2018 • Antreas Antoniou, Agnieszka Słowik, Elliot J. Crowley, Amos Storkey
Despite their impressive performance in many tasks, deep neural networks often struggle at relational reasoning.
9 code implementations • ICLR 2019 • Antreas Antoniou, Harrison Edwards, Amos Storkey
The field of few-shot learning has recently seen substantial advancements.
2 code implementations • 2 Oct 2018 • Luke N. Darlow, Elliot J. Crowley, Antreas Antoniou, Amos J. Storkey
In this brief technical report we introduce the CINIC-10 dataset as a plug-in extended alternative for CIFAR-10.
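What makes CINIC-10 a plug-in alternative, sketched below, is that it keeps CIFAR-10's ten classes and 32x32 image size while providing 270,000 images in three equal train/validation/test splits:

```python
# Hedged sketch of the drop-in property: same ten class names and image
# size as CIFAR-10, three equal splits of 90,000 images each.
CIFAR10_CLASSES = ["airplane", "automobile", "bird", "cat", "deer",
                   "dog", "frog", "horse", "ship", "truck"]
CINIC10_SPLITS = {"train": 90_000, "valid": 90_000, "test": 90_000}

# Any CIFAR-10 pipeline keyed on these class names can read CINIC-10's
# per-split class directories (e.g. with an ImageFolder-style loader).
print(len(CIFAR10_CLASSES), sum(CINIC10_SPLITS.values()))
```

The equal-sized splits also make it easy to use more training data than CIFAR-10 offers simply by merging the train and validation folders.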
Ranked #6 on Image Classification on CINIC-10
7 code implementations • ICLR 2018 • Antreas Antoniou, Amos Storkey, Harrison Edwards
The model, based on image-conditional Generative Adversarial Networks, takes data from a source domain and learns to generalise from any single data item to generate other within-class data items.
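The conditioning described above can be illustrated with a toy generator that takes a source image plus a noise vector and returns another same-shape, same-class sample. This is a stand-in perturbation, not the DAGAN encoder-decoder:

```python
import numpy as np

# Illustrative sketch only: a real DAGAN learns an encoder-decoder
# generator adversarially; here we just perturb the source image,
# preserving its shape, to show the input/output contract.
rng = np.random.default_rng(1)

def toy_generator(source_image, noise):
    """Condition on a source image and noise; emit a new same-class sample."""
    return source_image + 0.1 * noise.reshape(source_image.shape)

img = rng.random((8, 8))          # one data item from the source domain
z = rng.standard_normal(64)       # noise vector driving the variation
augmented = toy_generator(img, z)
print(augmented.shape)            # same shape as the source image
```

Because the generator conditions on a single item rather than a class label, it can augment classes it never saw during training, which is what makes it useful for few-shot settings.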