Search Results for author: Vincent Dumoulin

Found 33 papers, 22 papers with code

Data Selection for Transfer Unlearning

no code implementations • 16 May 2024 • Nazanin Mohammadi Sepahvand, Vincent Dumoulin, Eleni Triantafillou, Gintare Karolina Dziugaite

In this work, we advocate for a relaxed definition of unlearning that does not address privacy applications but targets a scenario where a data owner withdraws permission of use of their data for training purposes.

Machine Unlearning

BIRB: A Generalization Benchmark for Information Retrieval in Bioacoustics

3 code implementations • 12 Dec 2023 • Jenny Hamer, Eleni Triantafillou, Bart van Merriënboer, Stefan Kahl, Holger Klinck, Tom Denton, Vincent Dumoulin

The ability of a machine learning model to cope with differences between training and deployment conditions -- e.g., in the presence of distribution shift or when generalizing to new classes altogether -- is crucial for real-world use cases.

Information Retrieval • Representation Learning • +1

A density estimation perspective on learning from pairwise human preferences

1 code implementation • 23 Nov 2023 • Vincent Dumoulin, Daniel D. Johnson, Pablo Samuel Castro, Hugo Larochelle, Yann Dauphin

Learning from human feedback (LHF) -- and in particular learning from pairwise preferences -- has recently become a crucial ingredient in training large language models (LLMs), and has been the subject of much research.

Density Estimation
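
As a rough illustration of learning from pairwise preferences, the sketch below fits a simple Bradley-Terry-style model in which each item has a scalar score and the probability that one item is preferred over another is a logistic function of their score difference. This is a generic toy example, not the method analyzed in the paper, and the names (fit_preferences, sigmoid) are made up for illustration.

    import numpy as np

    # Toy Bradley-Terry-style model: each item i has a scalar score s[i], and the
    # probability that item a is preferred over item b is sigmoid(s[a] - s[b]).
    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def fit_preferences(pairs, n_items, lr=0.1, steps=500):
        """pairs: list of (winner, loser) index pairs."""
        s = np.zeros(n_items)
        for _ in range(steps):
            grad = np.zeros(n_items)
            for a, b in pairs:
                p = sigmoid(s[a] - s[b])   # model's probability that a beats b
                grad[a] += (1.0 - p)       # gradient of the log-likelihood
                grad[b] -= (1.0 - p)
            s += lr * grad / len(pairs)
        return s

    # Example: item 2 is consistently preferred over items 0 and 1.
    scores = fit_preferences([(2, 0), (2, 1), (1, 0)], n_items=3)
    print(scores)  # highest score should belong to item 2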

Active learning for fast and slow modeling attacks on Arbiter PUFs

no code implementations • 25 Aug 2023 • Vincent Dumoulin, Wenjing Rao, Natasha Devroye

In most modeling attacks, a random subset of challenge-response-pairs (CRPs) are used as the labeled data for the machine learning algorithm.

Active Learning
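
For context on the contrast drawn above between random CRP selection and active learning, here is a generic uncertainty-sampling loop on a toy linear model. It is only a sketch of the general active-learning idea, not the attack studied in the paper; the challenge/response generation and all names are hypothetical.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Hypothetical setup: challenges are +/-1 feature vectors, responses are 0/1 bits.
    X_pool = rng.choice([-1.0, 1.0], size=(5000, 64))
    w_true = rng.normal(size=64)
    y_pool = (X_pool @ w_true > 0).astype(int)

    labeled = list(rng.choice(len(X_pool), size=50, replace=False))

    # Uncertainty sampling: repeatedly query the challenges whose predicted
    # response probability is closest to 0.5 under the current model.
    for _ in range(10):
        model = LogisticRegression(max_iter=1000).fit(X_pool[labeled], y_pool[labeled])
        proba = model.predict_proba(X_pool)[:, 1]
        uncertainty = -np.abs(proba - 0.5)
        labeled_set = set(labeled)
        candidates = np.argsort(uncertainty)[::-1]
        new = [i for i in candidates if i not in labeled_set][:20]
        labeled.extend(new)

    print("accuracy on the pool:", model.score(X_pool, y_pool))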

In Search for a Generalizable Method for Source Free Domain Adaptation

no code implementations • 13 Feb 2023 • Malik Boudiaf, Tom Denton, Bart van Merriënboer, Vincent Dumoulin, Eleni Triantafillou

Source-free domain adaptation (SFDA) is compelling because it allows adapting an off-the-shelf model to a new domain using only unlabelled data.

Source-Free Domain Adaptation
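
As a hedged illustration of what adapting with only unlabelled data can look like, the sketch below applies one common source-free baseline, minimizing the entropy of the model's own predictions on target data. It is not necessarily the method evaluated in the paper; the model and data are placeholders.

    import torch
    import torch.nn as nn

    # Placeholder "off-the-shelf" source model and an unlabelled target batch.
    model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
    target_x = torch.randn(256, 32)            # unlabelled target-domain inputs

    opt = torch.optim.SGD(model.parameters(), lr=1e-3)

    # One common source-free baseline: minimize the entropy of the model's own
    # predictions, which sharpens decision boundaries without using any labels.
    for _ in range(100):
        probs = torch.softmax(model(target_x), dim=1)
        entropy = -(probs * torch.log(probs + 1e-8)).sum(dim=1).mean()
        opt.zero_grad()
        entropy.backward()
        opt.step()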

Proper Reuse of Image Classification Features Improves Object Detection

1 code implementation • CVPR 2022 • Cristina Vasconcelos, Vighnesh Birodkar, Vincent Dumoulin

A common practice in transfer learning is to initialize the downstream model weights by pre-training on a data-abundant upstream task.

Classification • Image Classification • +4

Head2Toe: Utilizing Intermediate Representations for Better Transfer Learning

1 code implementation • 10 Jan 2022 • Utku Evci, Vincent Dumoulin, Hugo Larochelle, Michael C. Mozer

We propose a method, Head-to-Toe probing (Head2Toe), that selects features from all layers of the source model to train a classification head for the target-domain.

Transfer Learning
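
A minimal sketch of the Head2Toe idea summarized above: freeze the source model, concatenate activations from all of its layers, and train only a linear classification head on the target task. The toy backbone and names are illustrative, and the paper's feature-selection step is omitted.

    import torch
    import torch.nn as nn

    # Toy pre-trained backbone whose intermediate activations we want to reuse.
    backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 64), nn.ReLU())

    def all_layer_features(x):
        """Concatenate activations from every layer of the (frozen) backbone."""
        feats = []
        h = x
        for layer in backbone:
            h = layer(h)
            feats.append(h)
        return torch.cat(feats, dim=1)

    x = torch.randn(128, 32)                    # target-domain inputs
    y = torch.randint(0, 5, (128,))             # target-domain labels

    with torch.no_grad():                       # backbone stays frozen
        feats = all_layer_features(x)

    head = nn.Linear(feats.shape[1], 5)         # classification head over all layers
    opt = torch.optim.Adam(head.parameters(), lr=1e-2)
    for _ in range(200):
        loss = nn.functional.cross_entropy(head(feats), y)
        opt.zero_grad()
        loss.backward()
        opt.step()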

Head2Toe: Utilizing Intermediate Representations for Better OOD Generalization

no code implementations • 29 Sep 2021 • Utku Evci, Vincent Dumoulin, Hugo Larochelle, Michael Curtis Mozer

We propose a method, Head-to-Toe probing (Head2Toe), that selects features from all layers of the source model to train a classification head for the target-domain.

Transfer Learning

Composing Features: Compositional Model Augmentation for Steerability of Music Transformers

no code implementations • 29 Sep 2021 • Halley Young, Vincent Dumoulin, Pablo Samuel Castro, Jesse Engel, Cheng-Zhi Anna Huang

To tackle the combinatorial nature of composing features, we propose a compositional approach to steering music transformers, building on lightweight fine-tuning methods such as prefix tuning and bias tuning.
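
Bias tuning, one of the lightweight fine-tuning methods mentioned above, freezes every pre-trained weight and updates only the bias terms. Below is a generic sketch of that idea on a stock transformer encoder, not the paper's music-transformer setup.

    import torch.nn as nn

    # Generic bias tuning: freeze every parameter of a pre-trained model except biases.
    model = nn.TransformerEncoder(
        nn.TransformerEncoderLayer(d_model=64, nhead=4), num_layers=2)

    for name, param in model.named_parameters():
        param.requires_grad = name.endswith("bias")

    trainable = [n for n, p in model.named_parameters() if p.requires_grad]
    print(f"{len(trainable)} bias tensors will be fine-tuned; all other weights stay frozen.")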

Impact of Aliasing on Generalization in Deep Convolutional Networks

no code implementations • ICCV 2021 • Cristina Vasconcelos, Hugo Larochelle, Vincent Dumoulin, Rob Romijnders, Nicolas Le Roux, Ross Goroshin

We investigate the impact of aliasing on generalization in Deep Convolutional Networks and show that data augmentation schemes alone are unable to prevent it due to structural limitations in widely used architectures.

Data Augmentation • Few-Shot Learning • +1

Domain Conditional Predictors for Domain Adaptation

1 code implementation • 25 Jun 2021 • Joao Monteiro, Xavier Gibert, Jianqiao Feng, Vincent Dumoulin, Dar-Shyang Lee

Domain adaptation approaches thus emerged as a useful framework offering extra flexibility, in that distinct train and test data distributions are supported, provided that other assumptions such as covariate shift are satisfied, i.e., that the conditional distribution of labels given inputs stays the same while the underlying input distribution changes.

Domain Adaptation
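
As a hedged sketch of what a domain conditional predictor can look like, the toy module below predicts the domain from an encoded input and conditions the label head on that (soft) domain prediction. The architecture and names are illustrative assumptions, not the exact model from the paper.

    import torch
    import torch.nn as nn

    class DomainConditionalPredictor(nn.Module):
        """Illustrative only: predict the domain, then condition the label head on it."""
        def __init__(self, in_dim=32, n_domains=3, n_classes=10):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU())
            self.domain_head = nn.Linear(64, n_domains)
            self.label_head = nn.Linear(64 + n_domains, n_classes)

        def forward(self, x):
            h = self.encoder(x)
            domain_logits = self.domain_head(h)
            # Condition the label prediction on the (soft) domain prediction.
            z = torch.cat([h, torch.softmax(domain_logits, dim=1)], dim=1)
            return self.label_head(z), domain_logits

    model = DomainConditionalPredictor()
    labels, domains = model(torch.randn(8, 32))
    print(labels.shape, domains.shape)   # torch.Size([8, 10]) torch.Size([8, 3])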

Learning a Universal Template for Few-shot Dataset Generalization

1 code implementation • 14 May 2021 • Eleni Triantafillou, Hugo Larochelle, Richard Zemel, Vincent Dumoulin

Few-shot dataset generalization is a challenging variant of the well-studied few-shot classification problem, in which a diverse training set spanning several datasets is given, with the goal of training an adaptable model that can then learn classes from new datasets using only a few examples.

Comparing Transfer and Meta Learning Approaches on a Unified Few-Shot Classification Benchmark

1 code implementation • 6 Apr 2021 • Vincent Dumoulin, Neil Houlsby, Utku Evci, Xiaohua Zhai, Ross Goroshin, Sylvain Gelly, Hugo Larochelle

To bridge this gap, we perform a cross-family study of the best transfer and meta learners on both a large-scale meta-learning benchmark (Meta-Dataset, MD), and a transfer learning benchmark (Visual Task Adaptation Benchmark, VTAB).

Few-Shot Learning • General Classification • +1

Learning Flexible Classifiers with Shot-CONditional Episodic (SCONE) Training

no code implementations • 1 Jan 2021 • Eleni Triantafillou, Vincent Dumoulin, Hugo Larochelle, Richard Zemel

We discover that fine-tuning on episodes of a particular shot can specialize the pre-trained model to solving episodes of that shot at the expense of performance on other shots, in agreement with a trade-off recently observed in the context of end-to-end episodic training.

Classification • General Classification

An Effective Anti-Aliasing Approach for Residual Networks

no code implementations • 20 Nov 2020 • Cristina Vasconcelos, Hugo Larochelle, Vincent Dumoulin, Nicolas Le Roux, Ross Goroshin

Image pre-processing in the frequency domain has traditionally played a vital role in computer vision and was even part of the standard pipeline in the early days of deep learning.

Few-Shot Learning • Image Classification • +1

Cross-Modulation Networks for Few-Shot Learning

no code implementations • 1 Dec 2018 • Hugo Prol, Vincent Dumoulin, Luis Herranz

A family of recent successful approaches to few-shot learning relies on learning an embedding space in which predictions are made by computing similarities between examples.

Few-Shot Learning
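
The embedding-and-similarity recipe described above can be illustrated with a prototypical-network-style classifier, where each class is represented by the mean of its support embeddings and queries are assigned to the nearest prototype. The sketch shows only that generic baseline, not the cross-modulation mechanism itself; all tensors are toy stand-ins for pre-computed embeddings.

    import torch

    def prototype_predict(support_x, support_y, query_x, n_classes):
        """Classify queries by distance to per-class mean embeddings (prototypes)."""
        protos = torch.stack(
            [support_x[support_y == c].mean(dim=0) for c in range(n_classes)])
        dists = torch.cdist(query_x, protos)          # (n_query, n_classes)
        return (-dists).argmax(dim=1)                 # nearest prototype wins

    # Toy episode: 5-way, 5-shot support set with made-up embeddings.
    support_x = torch.randn(25, 16)
    support_y = torch.arange(5).repeat_interleave(5)
    query_x = torch.randn(10, 16)
    print(prototype_predict(support_x, support_y, query_x, n_classes=5))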

Generative Adversarial Networks: An Overview

2 code implementations • 19 Oct 2017 • Antonia Creswell, Tom White, Vincent Dumoulin, Kai Arulkumaran, Biswa Sengupta, Anil A. Bharath

Generative adversarial networks (GANs) provide a way to learn deep representations without extensively annotated training data.

General Classification • Image Generation • +2

Learning Visual Reasoning Without Strong Priors

2 code implementations • 10 Jul 2017 • Ethan Perez, Harm de Vries, Florian Strub, Vincent Dumoulin, Aaron Courville

Previous work has operated under the assumption that visual reasoning calls for a specialized architecture, but we show that a general architecture with proper conditioning can learn to visually reason effectively.

Visual Reasoning
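
The "general architecture with proper conditioning" refers to feature-wise modulation of the visual pipeline by the question encoding. Below is a minimal FiLM-style modulation layer in that spirit; the shapes, layer choice, and names are illustrative, not the paper's exact architecture.

    import torch
    import torch.nn as nn

    class FiLM(nn.Module):
        """Feature-wise modulation: scale and shift feature maps per channel, with the
        scales/shifts predicted from a conditioning vector (e.g. a question encoding)."""
        def __init__(self, cond_dim, n_channels):
            super().__init__()
            self.to_gamma_beta = nn.Linear(cond_dim, 2 * n_channels)

        def forward(self, feature_maps, cond):
            gamma, beta = self.to_gamma_beta(cond).chunk(2, dim=1)
            gamma = gamma[:, :, None, None]           # broadcast over H, W
            beta = beta[:, :, None, None]
            return gamma * feature_maps + beta

    film = FiLM(cond_dim=32, n_channels=64)
    out = film(torch.randn(4, 64, 8, 8), torch.randn(4, 32))
    print(out.shape)   # torch.Size([4, 64, 8, 8])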

Exploring the structure of a real-time, arbitrary neural artistic stylization network

20 code implementations • 18 May 2017 • Golnaz Ghiasi, Honglak Lee, Manjunath Kudlur, Vincent Dumoulin, Jonathon Shlens

In this paper, we present a method which combines the flexibility of the neural algorithm of artistic style with the speed of fast style transfer networks to allow real-time stylization using any content/style image pair.

Style Transfer

A Learned Representation For Artistic Style

12 code implementations • 24 Oct 2016 • Vincent Dumoulin, Jonathon Shlens, Manjunath Kudlur

In this work we investigate the construction of a single, scalable deep network that can parsimoniously capture the artistic style of a diversity of paintings.

Diversity
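
The single network described above captures many styles through conditional instance normalization: feature maps are normalized and then scaled and shifted with parameters selected by the style index. A minimal sketch of that layer follows, with toy sizes and illustrative names rather than the paper's full style-transfer network.

    import torch
    import torch.nn as nn

    class ConditionalInstanceNorm(nn.Module):
        """Each style index selects its own per-channel scale (gamma) and shift (beta),
        applied after instance normalization of the feature maps."""
        def __init__(self, n_styles, n_channels):
            super().__init__()
            self.norm = nn.InstanceNorm2d(n_channels, affine=False)
            self.gamma = nn.Embedding(n_styles, n_channels)
            self.beta = nn.Embedding(n_styles, n_channels)

        def forward(self, x, style_idx):
            g = self.gamma(style_idx)[:, :, None, None]
            b = self.beta(style_idx)[:, :, None, None]
            return g * self.norm(x) + b

    cin = ConditionalInstanceNorm(n_styles=10, n_channels=64)
    out = cin(torch.randn(2, 64, 32, 32), torch.tensor([3, 7]))
    print(out.shape)   # torch.Size([2, 64, 32, 32])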

Adversarially Learned Inference

9 code implementations • 2 Jun 2016 • Vincent Dumoulin, Ishmael Belghazi, Ben Poole, Olivier Mastropietro, Alex Lamb, Martin Arjovsky, Aaron Courville

We introduce the adversarially learned inference (ALI) model, which jointly learns a generation network and an inference network using an adversarial process.

Image-to-Image Translation
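
In ALI, the discriminator operates on joint (x, z) pairs, comparing data paired with inferred latents against generated data paired with prior samples. The toy forward pass below sketches only that setup; it is not a full training loop, and the placeholder linear networks stand in for the paper's convolutional models.

    import torch
    import torch.nn as nn

    # Toy ALI-style components: an encoder (inference network), a decoder (generator),
    # and a discriminator that operates on joint (x, z) pairs.
    encoder = nn.Linear(784, 32)
    decoder = nn.Linear(32, 784)
    discriminator = nn.Sequential(nn.Linear(784 + 32, 128), nn.ReLU(), nn.Linear(128, 1))

    x = torch.rand(16, 784)                 # real data
    z = torch.randn(16, 32)                 # prior samples

    # Real pairs: data with its inferred latent. Fake pairs: generated data with its latent.
    d_real = discriminator(torch.cat([x, encoder(x)], dim=1))
    d_fake = discriminator(torch.cat([decoder(z), z], dim=1))

    # The adversarial game: the discriminator separates the two joint distributions,
    # while the encoder and decoder are trained to make them indistinguishable.
    d_loss = (nn.functional.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real))
              + nn.functional.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake)))
    print(d_loss.item())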

Theano: A Python framework for fast computation of mathematical expressions

1 code implementation • 9 May 2016 • The Theano Development Team, Rami Al-Rfou, Guillaume Alain, Amjad Almahairi, Christof Angermueller, Dzmitry Bahdanau, Nicolas Ballas, Frédéric Bastien, Justin Bayer, Anatoly Belikov, Alexander Belopolsky, Yoshua Bengio, Arnaud Bergeron, James Bergstra, Valentin Bisson, Josh Bleecher Snyder, Nicolas Bouchard, Nicolas Boulanger-Lewandowski, Xavier Bouthillier, Alexandre de Brébisson, Olivier Breuleux, Pierre-Luc Carrier, Kyunghyun Cho, Jan Chorowski, Paul Christiano, Tim Cooijmans, Marc-Alexandre Côté, Myriam Côté, Aaron Courville, Yann N. Dauphin, Olivier Delalleau, Julien Demouth, Guillaume Desjardins, Sander Dieleman, Laurent Dinh, Mélanie Ducoffe, Vincent Dumoulin, Samira Ebrahimi Kahou, Dumitru Erhan, Ziye Fan, Orhan Firat, Mathieu Germain, Xavier Glorot, Ian Goodfellow, Matt Graham, Caglar Gulcehre, Philippe Hamel, Iban Harlouchet, Jean-Philippe Heng, Balázs Hidasi, Sina Honari, Arjun Jain, Sébastien Jean, Kai Jia, Mikhail Korobov, Vivek Kulkarni, Alex Lamb, Pascal Lamblin, Eric Larsen, César Laurent, Sean Lee, Simon Lefrancois, Simon Lemieux, Nicholas Léonard, Zhouhan Lin, Jesse A. Livezey, Cory Lorenz, Jeremiah Lowin, Qianli Ma, Pierre-Antoine Manzagol, Olivier Mastropietro, Robert T. McGibbon, Roland Memisevic, Bart van Merriënboer, Vincent Michalski, Mehdi Mirza, Alberto Orlandi, Christopher Pal, Razvan Pascanu, Mohammad Pezeshki, Colin Raffel, Daniel Renshaw, Matthew Rocklin, Adriana Romero, Markus Roth, Peter Sadowski, John Salvatier, François Savard, Jan Schlüter, John Schulman, Gabriel Schwartz, Iulian Vlad Serban, Dmitriy Serdyuk, Samira Shabanian, Étienne Simon, Sigurd Spieckermann, S. Ramana Subramanyam, Jakub Sygnowski, Jérémie Tanguay, Gijs van Tulder, Joseph Turian, Sebastian Urban, Pascal Vincent, Francesco Visin, Harm de Vries, David Warde-Farley, Dustin J. Webb, Matthew Willson, Kelvin Xu, Lijun Xue, Li Yao, Saizheng Zhang, Ying Zhang

Since its introduction, it has been one of the most used CPU and GPU mathematical compilers - especially in the machine learning community - and has shown steady performance improvements.

BIG-bench Machine Learning • Clustering • +2

A guide to convolution arithmetic for deep learning

16 code implementations • 23 Mar 2016 • Vincent Dumoulin, Francesco Visin

We introduce a guide to help deep learning practitioners understand and manipulate convolutional neural network architectures.

Deep Learning
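
A worked example of the basic relation the guide covers: for input size i, kernel size k, padding p, and stride s, the convolution output size is floor((i + 2p - k) / s) + 1. A small helper to check a few cases:

    def conv_output_size(i, k, p=0, s=1):
        """Output spatial size of a convolution: floor((i + 2p - k) / s) + 1."""
        return (i + 2 * p - k) // s + 1

    # A few worked examples:
    print(conv_output_size(5, 3))            # 3  (no padding, unit stride)
    print(conv_output_size(5, 3, p=1))       # 5  ("same" padding for odd kernels)
    print(conv_output_size(7, 3, p=1, s=2))  # 4  (strided convolution)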

Discriminative Regularization for Generative Models

1 code implementation • 9 Feb 2016 • Alex Lamb, Vincent Dumoulin, Aaron Courville

We propose to take advantage of this by using the representations from discriminative classifiers to augment the objective function corresponding to a generative model.
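
As a hedged sketch of augmenting a generative objective with discriminative representations, the snippet below adds a penalty measured in the feature space of a frozen stand-in classifier to a plain reconstruction loss. The classifier, weighting, and names are illustrative assumptions rather than the paper's exact objective.

    import torch
    import torch.nn as nn

    # Illustrative only: a frozen stand-in for a discriminative classifier's feature extractor.
    classifier_features = nn.Sequential(nn.Linear(784, 128), nn.ReLU())
    for p in classifier_features.parameters():
        p.requires_grad = False

    def discriminatively_regularized_loss(x, x_recon, weight=0.1):
        # Pixel-space reconstruction plus a penalty in the classifier's feature space.
        pixel_loss = nn.functional.mse_loss(x_recon, x)
        feat_loss = nn.functional.mse_loss(classifier_features(x_recon), classifier_features(x))
        return pixel_loss + weight * feat_loss

    x = torch.rand(8, 784)
    x_recon = torch.rand(8, 784)
    print(discriminatively_regularized_loss(x, x_recon).item())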

On the Challenges of Physical Implementations of RBMs

no code implementations • 18 Dec 2013 • Vincent Dumoulin, Ian J. Goodfellow, Aaron Courville, Yoshua Bengio

Restricted Boltzmann machines (RBMs) are powerful machine learning models, but learning and some kinds of inference in the model require sampling-based approximations, which, in classical digital computers, are implemented using expensive MCMC.
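
The sampling-based approximation referred to above is typically contrastive divergence built on block Gibbs sampling. Below is a minimal CD-1 weight update for a binary RBM (biases omitted, toy sizes, not tuned), intended only to illustrate the MCMC step in question.

    import numpy as np

    rng = np.random.default_rng(0)
    n_visible, n_hidden, lr = 6, 4, 0.1
    W = 0.01 * rng.normal(size=(n_visible, n_hidden))

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_update(v0):
        """One step of contrastive divergence (CD-1) via block Gibbs sampling."""
        ph0 = sigmoid(v0 @ W)                         # P(h=1 | v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        pv1 = sigmoid(h0 @ W.T)                       # P(v=1 | h0)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W)                         # P(h=1 | v1)
        # Approximate gradient: positive-phase minus negative-phase statistics.
        return lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)

    batch = (rng.random((32, n_visible)) < 0.5).astype(float)
    W += cd1_update(batch)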
