Search Results for author: Christian Gagné

Found 40 papers, 12 papers with code

Analyzing Data Augmentation for Medical Images: A Case Study in Ultrasound Images

1 code implementation · 14 Mar 2024 · Adam Tupper, Christian Gagné

Data augmentation is one of the most effective techniques to improve the generalization performance of deep neural networks.

Data Augmentation
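The core idea behind this line of work is easy to sketch: label-preserving random transforms applied on the fly during training. A minimal NumPy sketch, where the flip probability and brightness range are illustrative choices, not the policy evaluated in the paper:

```python
import numpy as np

def augment_ultrasound(image, rng, flip_p=0.5, max_gain=0.2):
    """Apply simple stochastic augmentations to a grayscale image in [0, 1].

    Horizontal flips and brightness scaling are common label-preserving
    transformations; the specific policy here is for illustration only.
    """
    out = image.astype(np.float32)
    if rng.random() < flip_p:
        out = out[:, ::-1]                      # horizontal flip
    gain = 1.0 + rng.uniform(-max_gain, max_gain)
    out = np.clip(out * gain, 0.0, 1.0)         # brightness jitter
    return out

rng = np.random.default_rng(0)
img = rng.random((64, 64))
aug = augment_ultrasound(img, rng)
assert aug.shape == img.shape
```

Each call draws a fresh transform, so the network sees a different variant of the same labeled image at every epoch.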

Improving genetic algorithms performance via deterministic population shrinkage

no code implementations · 22 Jan 2024 · Juan Luis Jiménez Laredo, Carlos Fernandes, Juan Julián Merelo, Christian Gagné

Despite the intuition that the same population size is not needed throughout the run of an Evolutionary Algorithm (EA), most EAs use a fixed population size.

Filtering Pixel Latent Variables for Unmixing Noisy and Undersampled Volumetric Images

no code implementations · 8 Dec 2023 · Catherine Bouchard, Andréanne Deschênes, Vincent Boulanger, Jean-Michel Bellavance, Flavie Lavoie-Cardinal, Christian Gagné

The development of robust signal unmixing algorithms is essential for leveraging multimodal datasets acquired through a wide array of scientific imaging technologies, including hyperspectral or time-resolved acquisitions.

Hessian Aware Low-Rank Weight Perturbation for Continual Learning

1 code implementation · 26 Nov 2023 · Jiaqi Li, Rui Wang, Yuanhao Lai, Changjian Shui, Sabyasachi Sahoo, Charles X. Ling, Shichun Yang, Boyu Wang, Christian Gagné, Fan Zhou

We conduct extensive experiments on various benchmarks, including a dataset with large-scale tasks, and compare our method against some recent state-of-the-art methods to demonstrate the effectiveness and scalability of our proposed method.

Continual Learning

Domain Agnostic Image-to-image Translation using Low-Resolution Conditioning

no code implementations · 8 May 2023 · Mohamed Abid, Arman Afrasiyabi, Ihsen Hedhli, Jean-François Lalonde, Christian Gagné

Conditioned on a target image, such methods extract the target style and combine it with the source image content, keeping coherence between the domains.

Image-to-Image Translation, Translation

Improved Robustness Against Adaptive Attacks With Ensembles and Error-Correcting Output Codes

1 code implementation · 4 Mar 2023 · Thomas Philippon, Christian Gagné

Neural network ensembles have been studied extensively in the context of adversarial robustness and most ensemble-based approaches remain vulnerable to adaptive attacks.

Adversarial Robustness
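The error-correcting output code (ECOC) mechanism behind such ensembles can be sketched simply: each class gets a binary codeword, each ensemble member learns one bit (one column), and prediction decodes to the class with the nearest codeword, so a few flipped bits are tolerated. The 4-class, 7-bit code below is illustrative only, not the paper's design:

```python
import numpy as np

# Each row is a class codeword; each column would be learned by one
# ensemble member. Minimum Hamming distance of this code is 4, so any
# single corrupted bit can be corrected at decoding time.
codewords = np.array([
    [0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
])

def decode(bits):
    """Return the class whose codeword is closest in Hamming distance."""
    distances = np.abs(codewords - bits).sum(axis=1)
    return int(np.argmin(distances))

assert decode(np.array([0, 1, 1, 1, 1, 0, 0])) == 1  # exact codeword
assert decode(np.array([0, 1, 1, 0, 1, 0, 0])) == 1  # one bit corrupted
```

The redundancy is what an adaptive attacker must overcome: flipping one member's bit is no longer enough to change the decoded class.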

On Learning Fairness and Accuracy on Multiple Subgroups

1 code implementation · 19 Oct 2022 · Changjian Shui, Gezheng Xu, Qi Chen, Jiaqi Li, Charles Ling, Tal Arbel, Boyu Wang, Christian Gagné

At the upper level, the fair predictor is updated to be close to all subgroup-specific predictors.

Fairness

Evolving Domain Generalization

no code implementations · 31 May 2022 · William Wei Wang, Gezheng Xu, Ruizhi Pu, Jiaqi Li, Fan Zhou, Changjian Shui, Charles Ling, Christian Gagné, Boyu Wang

Domain generalization aims to learn a predictive model from multiple different but related source tasks that can generalize well to a target task without the need of accessing any target data.

Evolving Domain Generalization, Meta-Learning

Fair Representation Learning through Implicit Path Alignment

no code implementations · 26 May 2022 · Changjian Shui, Qi Chen, Jiaqi Li, Boyu Wang, Christian Gagné

We consider a fair representation learning perspective, where optimal predictors, on top of the data representation, are ensured to be invariant with respect to different sub-groups.

Fairness, Representation Learning

Gap Minimization for Knowledge Sharing and Transfer

no code implementations · 26 Jan 2022 · Boyu Wang, Jorge Mendez, Changjian Shui, Fan Zhou, Di Wu, Gezheng Xu, Christian Gagné, Eric Eaton

Unlike existing measures which are used as tools to bound the difference of expected risks between tasks (e.g., $\mathcal{H}$-divergence or discrepancy distance), we theoretically show that the performance gap can be viewed as a data- and algorithm-dependent regularizer, which controls the model complexity and leads to finer guarantees.

Representation Learning, Transfer Learning

On the benefits of representation regularization in invariance based domain generalization

no code implementations · 30 May 2021 · Changjian Shui, Boyu Wang, Christian Gagné

Our regularization is orthogonal to and can be straightforwardly adopted in existing domain generalization algorithms for invariant representation learning.

Domain Generalization, Representation Learning

Aggregating From Multiple Target-Shifted Sources

1 code implementation · 9 May 2021 · Changjian Shui, Zijian Li, Jiaqi Li, Christian Gagné, Charles Ling, Boyu Wang

Multi-source domain adaptation aims at leveraging the knowledge from multiple tasks for predicting a related target domain.

Unsupervised Domain Adaptation

Meta Learning Black-Box Population-Based Optimizers

no code implementations · 5 Mar 2021 · Hugo Siqueira Gomes, Benjamin Léger, Christian Gagné

From that framework's formulation, we propose to parameterize the algorithm using deep recurrent neural networks and use a meta-loss function based on stochastic algorithms' performance to train efficient data-driven optimizers over several related optimization tasks.

Meta-Learning

A Generative Model for Hallucinating Diverse Versions of Super Resolution Images

no code implementations · 12 Feb 2021 · Mohamed Abderrahmen Abid, Ihsen Hedhli, Christian Gagné

Traditionally, the main focus of image super-resolution techniques is on recovering the most likely high-quality images from low-quality images, using a one-to-one low- to high-resolution mapping.

Image Super-Resolution

Unified Principles For Multi-Source Transfer Learning Under Label Shifts

no code implementations · 1 Jan 2021 · Changjian Shui, Zijian Li, Jiaqi Li, Christian Gagné, Charles Ling, Boyu Wang

We study the label shift problem in multi-source transfer learning and derive new generic principles to control the target generalization risk.

Transfer Learning, Unsupervised Domain Adaptation

Mixture-based Feature Space Learning for Few-shot Image Classification

1 code implementation · ICCV 2021 · Arman Afrasiyabi, Jean-François Lalonde, Christian Gagné

In contrast, we propose to model base classes with mixture models by simultaneously training the feature extractor and learning the mixture model parameters in an online manner.

Clustering, Few-Shot Image Classification +2

Beyond $\mathcal{H}$-Divergence: Domain Adaptation Theory With Jensen-Shannon Divergence

no code implementations · 30 Jul 2020 · Changjian Shui, Qi Chen, Jun Wen, Fan Zhou, Christian Gagné, Boyu Wang

We reveal the incoherence between the widely-adopted empirical domain adversarial training and its generally-assumed theoretical counterpart based on $\mathcal{H}$-divergence.

Domain Adaptation, Transfer Learning

Input Dropout for Spatially Aligned Modalities

1 code implementation · 7 Feb 2020 · Sébastien de Blois, Mathieu Garon, Christian Gagné, Jean-François Lalonde

Computer vision datasets containing multiple modalities such as color, depth, and thermal properties are now commonly accessible and useful for solving a wide array of challenging tasks.

Object Tracking, Pedestrian Detection

Deep Active Learning: Unified and Principled Method for Query and Training

1 code implementation · 20 Nov 2019 · Changjian Shui, Fan Zhou, Christian Gagné, Boyu Wang

In this paper, we propose a unified and principled method for both the querying and training processes in deep batch active learning.

Active Learning
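The querying half of batch active learning is commonly driven by predictive uncertainty. A minimal entropy-based sketch, for illustration only (the unified method in the paper also couples querying with training):

```python
import numpy as np

def query_by_entropy(probs, batch_size):
    """Select the unlabeled examples with the highest predictive entropy.

    `probs` has shape (n_examples, n_classes); rows sum to one. Returns
    indices of the `batch_size` most uncertain examples, most uncertain first.
    """
    eps = 1e-12  # guard against log(0)
    entropy = -(probs * np.log(probs + eps)).sum(axis=1)
    return np.argsort(entropy)[-batch_size:][::-1]

probs = np.array([
    [0.98, 0.01, 0.01],   # confident prediction
    [0.34, 0.33, 0.33],   # near-uniform, highly uncertain
    [0.70, 0.20, 0.10],
])
assert query_by_entropy(probs, 1)[0] == 1
```

Selecting a whole batch this way can pick redundant near-duplicates, which is one of the issues a principled query-and-train formulation aims to address.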

Controlling Over-generalization and its Effect on Adversarial Examples Detection and Generation

no code implementations · ICLR 2019 · Mahdieh Abbasi, Arezoo Rajabi, Azadeh Sadat Mozafari, Rakesh B. Bobba, Christian Gagné

As an appropriate training set for the extra class, we introduce two resources that are computationally efficient to obtain: a representative natural out-distribution set and interpolated in-distribution samples.

A Principled Approach for Learning Task Similarity in Multitask Learning

1 code implementation · 21 Mar 2019 · Changjian Shui, Mahdieh Abbasi, Louis-Émile Robitaille, Boyu Wang, Christian Gagné

Hence, an important aspect of multitask learning is to understand the similarities within a set of tasks.

Learning of Image Dehazing Models for Segmentation Tasks

1 code implementation · 4 Mar 2019 · Sébastien de Blois, Ihsen Hedhli, Christian Gagné

To evaluate their performance, existing dehazing approaches generally rely on distance measures between the generated image and its corresponding ground truth.

Image Dehazing, Image Segmentation +2

Attended Temperature Scaling: A Practical Approach for Calibrating Deep Neural Networks

no code implementations · 27 Oct 2018 · Azadeh Sadat Mozafari, Hugo Siqueira Gomes, Wilson Leão, Steeven Janny, Christian Gagné

Temperature Scaling (TS) is a state-of-the-art measure-based calibration method that combines effectiveness with low time and memory complexity.

Autonomous Driving, Decision Making +1
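Plain temperature scaling, the baseline this work builds on, fits a single scalar T on held-out data so that softmax(logits / T) minimizes validation NLL. A sketch using grid search over T on synthetic, deliberately overconfident logits (the paper's Attended TS refines which validation samples inform the fit):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 91)):
    """Pick the temperature T minimizing validation NLL of softmax(logits / T)."""
    def nll(T):
        p = softmax(logits / T)
        return -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()
    return min(grid, key=nll)

rng = np.random.default_rng(2)
labels = rng.integers(0, 3, size=300)
logits = rng.normal(size=(300, 3))
logits[np.arange(300), labels] += 4.0   # large margins -> overconfidence
labels[:60] = (labels[:60] + 1) % 3     # corrupt some labels: confident mistakes
T = fit_temperature(logits, labels)
assert 0.5 <= T <= 5.0
```

Dividing logits by the fitted T rescales confidence without changing the argmax, which is why TS preserves accuracy while improving calibration.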

Accumulating Knowledge for Lifelong Online Learning

no code implementations · 26 Oct 2018 · Changjian Shui, Ihsen Hedhli, Christian Gagné

We provide a theoretical analysis of this algorithm, including a cumulative error upper bound for each task.

Transfer Learning

Evaluating and Characterizing Incremental Learning from Non-Stationary Data

no code implementations · 18 Jun 2018 · Alejandro Cervantes, Christian Gagné, Pedro Isasi, Marc Parizeau

Incremental learning from non-stationary data poses special challenges to the field of machine learning.

Incremental Learning

Towards Dependable Deep Convolutional Neural Networks (CNNs) with Out-distribution Learning

no code implementations · 24 Apr 2018 · Mahdieh Abbasi, Arezoo Rajabi, Christian Gagné, Rakesh B. Bobba

Detection and rejection of adversarial examples in security-sensitive and safety-critical systems using deep CNNs is essential.

Learning to Become an Expert: Deep Networks Applied To Super-Resolution Microscopy

no code implementations · 28 Mar 2018 · Louis-Émile Robitaille, Audrey Durand, Marc-André Gardner, Christian Gagné, Paul De Koninck, Flavie Lavoie-Cardinal

More specifically, we propose a system based on a deep neural network that provides a quantitative quality measure of a STED image of neuronal structures given as input.

Super-Resolution

Diversity regularization in deep ensembles

no code implementations · 22 Feb 2018 · Changjian Shui, Azadeh Sadat Mozafari, Jonathan Marek, Ihsen Hedhli, Christian Gagné

Calibrating the confidence of supervised learning models is important for a variety of contexts where the certainty over predictions should be reliable.

Out-distribution training confers robustness to deep neural networks

1 code implementation · 20 Feb 2018 · Mahdieh Abbasi, Christian Gagné

The ease with which adversarial instances can be generated in deep neural networks raises fundamental questions about their functioning and concerns about their use in critical systems.

Learning to Predict Indoor Illumination from a Single Image

no code implementations · 1 Apr 2017 · Marc-André Gardner, Kalyan Sunkavalli, Ersin Yumer, Xiaohui Shen, Emiliano Gambaretto, Christian Gagné, Jean-François Lalonde

We propose an automatic method to infer high dynamic range illumination from a single, limited field-of-view, low dynamic range photograph of an indoor scene.

Lighting Estimation

Robustness to Adversarial Examples through an Ensemble of Specialists

no code implementations · 22 Feb 2017 · Mahdieh Abbasi, Christian Gagné

We propose using an ensemble of diverse specialists, where specialty is defined according to the confusion matrix.

Estimating Quality in Multi-Objective Bandits Optimization

no code implementations · 4 Jan 2017 · Audrey Durand, Christian Gagné

The question is: how good do estimations of these objectives have to be in order for the solution maximizing the preference function to remain unchanged?

Thompson Sampling
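In the single-objective special case, the estimation question reduces to a classic exploration/exploitation trade-off. A Thompson Sampling sketch for Bernoulli arms (the multi-objective setting studied in the paper layers a preference function over the objective estimates on top of this):

```python
import numpy as np

def thompson_bernoulli(true_means, horizon, rng):
    """Thompson Sampling for Bernoulli arms with Beta(1, 1) priors.

    At each step, sample one value per arm from its Beta posterior and
    pull the arm with the largest sample; update that arm's posterior.
    """
    k = len(true_means)
    successes = np.ones(k)   # Beta alpha parameters
    failures = np.ones(k)    # Beta beta parameters
    pulls = np.zeros(k, dtype=int)
    for _ in range(horizon):
        samples = rng.beta(successes, failures)  # one posterior draw per arm
        arm = int(np.argmax(samples))
        reward = rng.random() < true_means[arm]
        successes[arm] += reward
        failures[arm] += 1 - reward
        pulls[arm] += 1
    return pulls

rng = np.random.default_rng(3)
pulls = thompson_bernoulli([0.2, 0.5, 0.8], horizon=2000, rng=rng)
assert pulls.sum() == 2000
assert int(np.argmax(pulls)) == 2  # the best arm dominates play
```

The pull counts concentrate on the best arm as its posterior sharpens, which is exactly the sense in which estimates only need to be "good enough" to keep the preferred choice unchanged.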

Alternating Direction Method of Multipliers for Sparse Convolutional Neural Networks

no code implementations · 5 Nov 2016 · Farkhondeh Kiaee, Christian Gagné, Mahdieh Abbasi

This method alternates between promoting the sparsity of the network and optimizing the recognition performance, which allows us to exploit the two-part structure of the corresponding objective functions.

Bayesian Hyperparameter Optimization for Ensemble Learning

no code implementations · 20 May 2016 · Julien-Charles Lévesque, Christian Gagné, Robert Sabourin

Our method consists of building a fixed-size ensemble, optimizing the configuration of one classifier of the ensemble at each iteration of the hyperparameter optimization algorithm, while taking into consideration the interaction with the other models when evaluating potential performances.

Bayesian Optimization, Ensemble Learning +1

Sustainable Cooperative Coevolution with a Multi-Armed Bandit

no code implementations · 10 Apr 2013 · François-Michel De Rainville, Michèle Sebag, Christian Gagné, Marc Schoenauer, Denis Laurendeau

At each iteration, the dynamic multi-armed bandit makes a decision on which species to evolve for a generation, using the history of progress made by the different species to guide the decisions.
