Search Results for author: Richard E Turner

Found 9 papers, 2 papers with code

On the Efficacy of Differentially Private Few-shot Image Classification

1 code implementation 2 Feb 2023 Marlon Tobaben, Aliaksandra Shysheya, John Bronskill, Andrew Paverd, Shruti Tople, Santiago Zanella-Beguelin, Richard E Turner

There has been significant recent progress in training differentially private (DP) models which achieve accuracy that approaches the best non-private models.

Federated Learning Few-Shot Image Classification
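
For orientation only: the standard recipe behind differentially private training is DP-SGD, which clips each per-example gradient and adds calibrated Gaussian noise before the optimizer step. The sketch below is a minimal, illustrative PyTorch version; the function name, looped per-example gradients, and hyperparameter values are assumptions, not the paper's implementation.

import torch

def dp_sgd_step(model, loss_fn, batch_x, batch_y, optimizer,
                clip_norm=1.0, noise_multiplier=1.0):
    # Clip each per-example gradient to clip_norm, sum them, add Gaussian noise.
    params = [p for p in model.parameters() if p.requires_grad]
    summed = [torch.zeros_like(p) for p in params]

    # Per-example gradients, looped for clarity (real systems vectorise this).
    for x, y in zip(batch_x, batch_y):
        loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)
        norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = (clip_norm / (norm + 1e-6)).clamp(max=1.0)
        for s, g in zip(summed, grads):
            s += scale * g

    optimizer.zero_grad()
    for p, s in zip(params, summed):
        noise = noise_multiplier * clip_norm * torch.randn_like(s)
        p.grad = (s + noise) / len(batch_x)
    optimizer.step()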

FiT: Parameter Efficient Few-shot Transfer Learning for Personalized and Federated Image Classification

1 code implementation 17 Jun 2022 Aliaksandra Shysheya, John Bronskill, Massimiliano Patacchiola, Sebastian Nowozin, Richard E Turner

Modern deep learning systems are increasingly deployed in situations such as personalization and federated learning where it is necessary to support i) learning on small amounts of data, and ii) communication efficient distributed training protocols.

Federated Learning Few-Shot Learning +2
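
FiT's parameter efficiency comes from adapting a frozen pretrained backbone with feature-wise linear modulation (FiLM) layers: a learned per-channel scale and shift. The module below is a minimal, illustrative sketch (class and argument names are assumptions), intended only to show how few parameters per task this involves.

import torch
import torch.nn as nn

class FiLM(nn.Module):
    # Per-channel scale (gamma) and shift (beta) applied to a feature map.
    def __init__(self, num_channels):
        super().__init__()
        self.gamma = nn.Parameter(torch.ones(num_channels))
        self.beta = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x):  # x: (batch, channels, height, width)
        return self.gamma.view(1, -1, 1, 1) * x + self.beta.view(1, -1, 1, 1)

Because only these scales and shifts (plus a classifier head) are trained per user or per client, both the data and the communication requirements stay small.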

Practical Conditional Neural Process Via Tractable Dependent Predictions

no code implementations ICLR 2022 Stratis Markou, James Requeima, Wessel Bruinsma, Anna Vaughan, Richard E Turner

Existing approaches which model output dependencies, such as Neural Processes (NPs; Garnelo et al., 2018) or the FullConvGNP (Bruinsma et al., 2021), are either complicated to train or prohibitively expensive.

Decision Making Meta-Learning
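
For contrast with the dependency-modelling approaches discussed in the paper, a plain conditional neural process predicts an independent Gaussian at each target input: context pairs are encoded, averaged into one representation, and decoded into a mean and standard deviation per target. The sketch below is a minimal, illustrative implementation; layer sizes and names are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CNP(nn.Module):
    def __init__(self, x_dim=1, y_dim=1, r_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim + y_dim, r_dim), nn.ReLU(),
                                     nn.Linear(r_dim, r_dim))
        self.decoder = nn.Sequential(nn.Linear(x_dim + r_dim, r_dim), nn.ReLU(),
                                     nn.Linear(r_dim, 2 * y_dim))

    def forward(self, xc, yc, xt):
        # Encode and average the context set into a single representation.
        r = self.encoder(torch.cat([xc, yc], dim=-1)).mean(dim=0)
        r = r.expand(xt.shape[0], -1)
        # Decode an independent Gaussian at each target input.
        mean, raw_std = self.decoder(torch.cat([xt, r], dim=-1)).chunk(2, dim=-1)
        return mean, 0.01 + F.softplus(raw_std)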

Attacking Few-Shot Classifiers with Adversarial Support Poisoning

no code implementations ICML Workshop AML 2021 Elre Talea Oldewage, John F Bronskill, Richard E Turner

This paper examines the robustness of deployed few-shot meta-learning systems when they are fed an imperceptibly perturbed few-shot dataset, showing that the resulting predictions on test inputs can become worse than chance.

Meta-Learning
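
One way to picture the threat model: take a nearest-prototype few-shot classifier and run a PGD-style attack on the support images so that the adapted classifier's loss on held-out queries increases. The sketch below is purely illustrative and is not the paper's attack; the feature extractor, the label convention (classes indexed 0..C-1), and the step sizes are assumptions.

import torch
import torch.nn.functional as F

def poison_support(embed, support_x, support_y, query_x, query_y,
                   eps=8 / 255, alpha=2 / 255, steps=10):
    # Perturb the support set (within an L-infinity ball of radius eps)
    # to maximise the query loss of a prototypical-network-style classifier.
    delta = torch.zeros_like(support_x, requires_grad=True)
    for _ in range(steps):
        z_s = embed(support_x + delta)
        z_q = embed(query_x)
        protos = torch.stack([z_s[support_y == c].mean(0)
                              for c in support_y.unique()])
        logits = -torch.cdist(z_q, protos)          # nearest-prototype scores
        loss = F.cross_entropy(logits, query_y)     # attacker maximises this
        loss.backward()
        with torch.no_grad():
            delta += alpha * delta.grad.sign()
            delta.clamp_(-eps, eps)
            delta.grad.zero_()
    return (support_x + delta).detach()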

Attacking Few-Shot Classifiers with Adversarial Support Sets

no code implementations 1 Jan 2021 Elre Talea Oldewage, John F Bronskill, Richard E Turner

Few-shot learning systems, especially those based on meta-learning, have recently made significant advances, and are now being considered for real world problems in healthcare, personalization, and science.

Few-Shot Learning General Classification

Marginal Likelihood Gradient for Bayesian Neural Networks

no code implementations Approximate Inference (AABI) Symposium 2021 Marcin B. Tomczak, Richard E Turner

Bayesian learning of neural networks is attractive as it can protect against over-fitting and provide automatic methods for inferring important hyperparameters by maximizing the marginal probability of the data.

Variational Inference
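
The underlying idea, shown here on a model where the evidence is available in closed form, is type-II maximum likelihood: differentiate the log marginal likelihood with respect to the hyperparameters and follow its gradient. The Bayesian-linear-regression sketch below is a toy stand-in, not the paper's method for neural networks; the data and hyperparameter names are assumptions.

import torch

def log_marginal_likelihood(X, y, log_alpha, log_beta):
    # Evidence of Bayesian linear regression: y ~ N(0, X X^T / alpha + I / beta),
    # with prior precision alpha and observation-noise precision beta.
    alpha, beta = log_alpha.exp(), log_beta.exp()
    cov = X @ X.T / alpha + torch.eye(X.shape[0]) / beta
    dist = torch.distributions.MultivariateNormal(torch.zeros(X.shape[0]), cov)
    return dist.log_prob(y)

# Tune the hyperparameters by gradient ascent on the evidence (toy data).
X, y = torch.randn(50, 3), torch.randn(50)
log_alpha = torch.zeros((), requires_grad=True)
log_beta = torch.zeros((), requires_grad=True)
opt = torch.optim.Adam([log_alpha, log_beta], lr=0.05)
for _ in range(200):
    opt.zero_grad()
    (-log_marginal_likelihood(X, y, log_alpha, log_beta)).backward()
    opt.step()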

Combining Variational Continual Learning with FiLM Layers

no code implementations ICML Workshop LifelongML 2020 Noel Loo, Siddharth Swaroop, Richard E Turner

The standard architecture for continual learning is a multi-headed neural network, which has shared body parameters and task-specific heads.

Continual Learning
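
To make the architecture concrete: the shared body extracts features for every task, while each task gets its own small output head. The sketch below is a minimal, illustrative PyTorch version; the sizes and names are assumptions (the paper's contribution is combining this setup with variational continual learning and FiLM layers, not the architecture itself).

import torch.nn as nn

class MultiHeadNet(nn.Module):
    def __init__(self, in_dim, hidden_dim, classes_per_task, num_tasks):
        super().__init__()
        # Body shared across all tasks.
        self.body = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU(),
                                  nn.Linear(hidden_dim, hidden_dim), nn.ReLU())
        # One lightweight head per task.
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, classes_per_task) for _ in range(num_tasks)])

    def forward(self, x, task_id):
        return self.heads[task_id](self.body(x))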
