Search Results for author: Eleni Triantafillou

Found 15 papers, 6 papers with code

Data Selection for Transfer Unlearning

no code implementations • 16 May 2024 • Nazanin Mohammadi Sepahvand, Vincent Dumoulin, Eleni Triantafillou, Gintare Karolina Dziugaite

In this work, we advocate for a relaxed definition of unlearning that does not address privacy applications but instead targets a scenario where a data owner withdraws permission to use their data for training purposes.

Machine Unlearning

BIRB: A Generalization Benchmark for Information Retrieval in Bioacoustics

1 code implementation • 12 Dec 2023 • Jenny Hamer, Eleni Triantafillou, Bart van Merriënboer, Stefan Kahl, Holger Klinck, Tom Denton, Vincent Dumoulin

The ability of a machine learning model to cope with differences between training and deployment conditions (e.g. in the presence of distribution shift, or generalization to new classes altogether) is crucial for real-world use cases.

Information Retrieval • Representation Learning • +1

Towards Unbounded Machine Unlearning

1 code implementation • NeurIPS 2023 • Meghdad Kurmanji, Peter Triantafillou, Jamie Hayes, Eleni Triantafillou

This paper is, to our knowledge, the first to study unlearning for different applications (RB, RC, UP), with the view that each has its own desiderata, definitions of 'forgetting', and associated metrics for forget quality.

Inference Attack • Machine Unlearning • +1

In Search for a Generalizable Method for Source Free Domain Adaptation

no code implementations • 13 Feb 2023 • Malik Boudiaf, Tom Denton, Bart van Merriënboer, Vincent Dumoulin, Eleni Triantafillou

Source-free domain adaptation (SFDA) is compelling because it allows adapting an off-the-shelf model to a new domain using only unlabelled data.

Source-Free Domain Adaptation
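Since SFDA adapts using only unlabeled target data, a common family of baselines (in the spirit of Tent, not necessarily the method studied in this paper) adapts the model by minimizing the entropy of its own predictions. A minimal sketch of the quantity being minimized:

```python
import numpy as np

def prediction_entropy(logits):
    """Mean Shannon entropy of softmax predictions over a batch of
    unlabeled target examples. Minimizing this is a common SFDA
    baseline (in the spirit of Tent); shown here only as an
    illustration, not as this paper's method."""
    # Subtract the row-wise max for numerical stability before softmax.
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    # Per-example entropy, averaged over the batch.
    return float(-(probs * np.log(probs + 1e-12)).sum(axis=1).mean())
```

Uniform predictions over C classes give the maximum entropy log(C); a confident model scores near zero, which is what adaptation drives toward.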

Learning a Universal Template for Few-shot Dataset Generalization

1 code implementation • 14 May 2021 • Eleni Triantafillou, Hugo Larochelle, Richard Zemel, Vincent Dumoulin

Few-shot dataset generalization is a challenging variant of the well-studied few-shot classification problem, in which a diverse training set comprising several datasets is given, for the purpose of training an adaptable model that can then learn classes from new datasets using only a few examples.
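The "universal template" here keeps a shared feature extractor and makes only small, dataset-specific modulation parameters vary, in the style of FiLM (feature-wise linear modulation). A minimal sketch of that modulation (an illustration of the idea, not the paper's full training procedure):

```python
import numpy as np

def film(features, gamma, beta):
    """Feature-wise Linear Modulation: per-channel scale and shift.
    In a universal-template setup, the backbone weights are shared
    across datasets while only (gamma, beta) are dataset-specific,
    so adapting to a new dataset means fitting these few parameters.
    Sketch only; the paper's method involves more than this layer."""
    return gamma[None, :] * features + beta[None, :]
```

Because gamma and beta are tiny compared to the backbone, they can be fit from a handful of examples without overfitting the shared template.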

Learning Flexible Classifiers with Shot-CONditional Episodic (SCONE) Training

no code implementations • 1 Jan 2021 • Eleni Triantafillou, Vincent Dumoulin, Hugo Larochelle, Richard Zemel

We discover that fine-tuning on episodes of a particular shot can specialize the pre-trained model to solving episodes of that shot at the expense of performance on other shots, in agreement with a trade-off recently observed in the context of end-to-end episodic training.

Classification • General Classification

Exploring representation learning for flexible few-shot tasks

no code implementations • 1 Jan 2021 • Mengye Ren, Eleni Triantafillou, Kuan-Chieh Wang, James Lucas, Jake Snell, Xaq Pitkow, Andreas S. Tolias, Richard Zemel

In this work, we consider a realistic setting where the relationship between examples can change from episode to episode depending on the task context, which is not given to the learner.

Few-Shot Learning • Representation Learning

Probing Few-Shot Generalization with Attributes

no code implementations • 10 Dec 2020 • Mengye Ren, Eleni Triantafillou, Kuan-Chieh Wang, James Lucas, Jake Snell, Xaq Pitkow, Andreas S. Tolias, Richard Zemel

Despite impressive progress in deep learning, generalizing far beyond the training distribution is an important open challenge.

Attribute • Few-Shot Learning • +1

Out-of-distribution Detection in Few-shot Classification

no code implementations • 25 Sep 2019 • Kuan-Chieh Wang, Paul Vicol, Eleni Triantafillou, Chia-Cheng Liu, Richard Zemel

In this work, we propose tasks for out-of-distribution detection in the few-shot setting and establish benchmark datasets, based on four popular few-shot classification datasets.

Classification • Out-of-Distribution Detection
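A standard reference point for any OOD-detection benchmark, including the few-shot tasks proposed here, is the maximum-softmax-probability (MSP) score of Hendrycks & Gimpel. Shown as a generic baseline, not as this paper's specific detector:

```python
import numpy as np

def max_softmax_ood_score(logits):
    """OOD score = 1 - max softmax probability (the MSP baseline).
    Higher score means the example is more likely out-of-distribution.
    A generic baseline sketch, not this paper's proposed method."""
    # Stable softmax over the class dimension.
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    # A confident in-distribution prediction drives the score toward 0.
    return 1.0 - probs.max(axis=1)
```

In the few-shot setting, the logits would come from an episodic classifier (e.g. distances to class prototypes), so the same thresholding recipe applies per episode.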

Meta-Learning for Semi-Supervised Few-Shot Classification

9 code implementations • ICLR 2018 • Mengye Ren, Eleni Triantafillou, Sachin Ravi, Jake Snell, Kevin Swersky, Joshua B. Tenenbaum, Hugo Larochelle, Richard S. Zemel

To address this paradigm, we propose novel extensions of Prototypical Networks (Snell et al., 2017) that are augmented with the ability to use unlabeled examples when producing prototypes.

General Classification • Meta-Learning
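The simplest of the proposed extensions refines the labeled prototypes with unlabeled embeddings via soft k-means. A minimal sketch of that refinement step (simplified: each original prototype is treated as a single labeled point with weight 1, and the paper's distractor-handling variants are omitted):

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def refine_prototypes(prototypes, unlabeled, n_iters=1):
    """Soft k-means refinement of class prototypes with unlabeled
    embeddings, a sketch of the idea in Ren et al. (2018): each
    unlabeled point is softly assigned to prototypes by (negative
    squared) distance, then prototypes are recomputed as weighted
    means of labeled and unlabeled points."""
    protos = prototypes.copy()
    for _ in range(n_iters):
        # Squared Euclidean distances: (n_unlabeled, n_classes).
        d2 = ((unlabeled[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
        w = softmax(-d2, axis=1)  # soft assignments per unlabeled point
        # Weighted mean; each original prototype counts with weight 1.
        protos = (protos + w.T @ unlabeled) / (1.0 + w.sum(0)[:, None])
    return protos
```

Unlabeled points pull the nearest prototype toward them, which tightens class clusters when the unlabeled pool is drawn from the episode's classes.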
