no code implementations • 19 Feb 2024 • Paul Krzakala, Junjie Yang, Rémi Flamary, Florence d'Alché-Buc, Charlotte Laclau, Matthieu Labeau
We present a novel end-to-end deep learning-based approach for Supervised Graph Prediction (SGP).
no code implementations • 3 Feb 2024 • Hugues van Assel, Cédric Vincent-Cuaz, Nicolas Courty, Rémi Flamary, Pascal Frossard, Titouan Vayer
Unsupervised learning aims to capture the underlying structure of potentially large and high-dimensional datasets.
no code implementations • 24 Jan 2024 • Antoine Collas, Rémi Flamary, Alexandre Gramfort
This paper introduces a novel domain adaptation technique for time series data, called Mixing model Stiefel Adaptation (MSA), specifically addressing the challenge of limited labeled signals in the target dataset.
no code implementations • 5 Oct 2023 • Hugues van Assel, Cédric Vincent-Cuaz, Titouan Vayer, Rémi Flamary, Nicolas Courty
We present a versatile adaptation of existing dimensionality reduction (DR) objectives, enabling the simultaneous reduction of both sample and feature sizes.
no code implementations • 19 Jul 2023 • Eloi Tanguy, Rémi Flamary, Julie Delon
We investigate the regularity and optimisation properties of this energy, as well as its Monte-Carlo approximation $\mathcal{E}_p$ (estimating the expectation in SW using only $p$ samples) and show convergence results on the critical points of $\mathcal{E}_p$ to those of $\mathcal{E}$, as well as an almost-sure uniform convergence and a uniform Central Limit result on the process $\mathcal{E}_p(Y)$.
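The Monte-Carlo approximation $\mathcal{E}_p$ can be illustrated concretely: in 1D, optimal transport reduces to sorting, so sliced Wasserstein is estimated by averaging 1D transport costs over $p$ random projections. A minimal numpy sketch (the function name and equal-size-cloud assumption are illustrative, not from the paper):

```python
import numpy as np

def sliced_wasserstein_mc(X, Y, p=50, seed=0):
    """Monte-Carlo estimate of the squared sliced Wasserstein distance
    between two equal-size point clouds X, Y (n x d), using p random
    projection directions (the 'p samples' of the expectation)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(p):
        # Draw a uniform direction on the unit sphere.
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)
        # Project both clouds to 1D; 1D OT with uniform weights
        # reduces to matching sorted samples.
        x_proj = np.sort(X @ theta)
        y_proj = np.sort(Y @ theta)
        total += np.mean((x_proj - y_proj) ** 2)
    return total / p
```

As $p$ grows this estimate converges to the expectation over directions, which is the setting in which the paper studies convergence of critical points.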
1 code implementation • 30 May 2023 • Théo Gnassounou, Rémi Flamary, Alexandre Gramfort
In many machine learning applications on signals and biomedical data, especially electroencephalogram (EEG), one major challenge is the variability of the data across subjects, sessions, and hardware devices.
1 code implementation • 9 Mar 2023 • Antoine Collas, Titouan Vayer, Rémi Flamary, Arnaud Breloy
Dimension reduction (DR) methods provide systematic approaches for analyzing high-dimensional data.
1 code implementation • 19 Jun 2022 • Alexis Thual, Huy Tran, Tatiana Zemskova, Nicolas Courty, Rémi Flamary, Stanislas Dehaene, Bertrand Thirion
We demonstrate that FUGW is well-suited for whole-brain landmark-free alignment.
1 code implementation • 31 May 2022 • Cédric Vincent-Cuaz, Rémi Flamary, Marco Corneli, Titouan Vayer, Nicolas Courty
Current Graph Neural Networks (GNN) architectures generally rely on two important components: node features embedding through message passing, and aggregation with a specialized form of pooling.
Ranked #1 on Graph Classification on NCI1
no code implementations • 30 May 2022 • Quang Huy Tran, Hicham Janati, Nicolas Courty, Rémi Flamary, Ievgen Redko, Pinar Demetci, Ritambhara Singh
With this result in hand, we provide empirical evidence of this robustness for the challenging tasks of heterogeneous domain adaptation with and without varying proportions of classes and simultaneous alignment of samples and features across single-cell measurements.
no code implementations • 20 Apr 2022 • Dimitri Bouche, Rémi Flamary, Florence d'Alché-Buc, Riwal Plougonven, Marianne Clausel, Jordi Badosa, Philippe Drobinski
We study short-term prediction of wind speed and wind power (every 10 minutes up to 4 hours ahead).
1 code implementation • 8 Feb 2022 • Luc Brogat-Motte, Rémi Flamary, Céline Brouard, Juho Rousu, Florence d'Alché-Buc
This paper introduces a novel and generic framework to solve the flagship task of supervised labeled graph prediction by leveraging Optimal Transport tools.
1 code implementation • 6 Oct 2021 • Cédric Vincent-Cuaz, Rémi Flamary, Marco Corneli, Titouan Vayer, Nicolas Courty
To this end, the Gromov-Wasserstein (GW) distance, based on Optimal Transport (OT), has proven to be successful in handling the specific nature of the associated objects.
no code implementations • 1 Oct 2021 • Quang Huy Tran, Hicham Janati, Ievgen Redko, Rémi Flamary, Nicolas Courty
Optimal transport (OT) theory nowadays underlies many emerging machine learning (ML) methods, solving a wide range of tasks such as generative modeling, transfer learning, and information retrieval.

no code implementations • ICLR 2022 • Cédric Vincent-Cuaz, Rémi Flamary, Marco Corneli, Titouan Vayer, Nicolas Courty
To this end, the Gromov-Wasserstein (GW) distance, based on Optimal Transport (OT), has proven to be successful in handling the specific nature of the associated objects.
1 code implementation • NeurIPS 2021 • Laetitia Chapel, Rémi Flamary, Haoran Wu, Cédric Févotte, Gilles Gasso
In particular, we consider majorization-minimization which leads in our setting to efficient multiplicative updates for a variety of penalties.
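Majorization-minimization with multiplicative updates can be illustrated by the classic Lee–Seung updates for NMF under the KL divergence, the prototypical MM scheme of this kind; this is a generic sketch for intuition, not the unbalanced-OT updates actually derived in the paper:

```python
import numpy as np

def kl_nmf(V, k, n_iter=200, seed=0):
    """Multiplicative updates for NMF under the KL divergence
    (Lee & Seung). Each update minimizes a majorizing surrogate of
    the objective and keeps W, H non-negative by construction."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    eps = 1e-12  # avoid division by zero
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.T @ np.ones_like(V) + eps)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (np.ones_like(V) @ H.T + eps)
    return W, H
```

The same MM template, with a different surrogate, yields the multiplicative updates for the OT-based penalties considered in the paper.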
2 code implementations • 5 Mar 2021 • Kilian Fatras, Thibault Séjourné, Nicolas Courty, Rémi Flamary
Optimal transport distances have found many applications in machine learning for their capacity to compare non-parametric probability distributions.
1 code implementation • 12 Feb 2021 • Cédric Vincent-Cuaz, Titouan Vayer, Rémi Flamary, Marco Corneli, Nicolas Courty
Dictionary learning is a key tool for representation learning that explains the data as a linear combination of a few basic elements.
Ranked #1 on Graph Classification on BZR
2 code implementations • 5 Jan 2021 • Kilian Fatras, Younes Zine, Szymon Majewski, Rémi Flamary, Rémi Gribonval, Nicolas Courty
We notably argue that the minibatch strategy comes with appealing properties such as unbiased estimators, gradients and a concentration bound around the expectation, but also with limits: the minibatch OT is not a distance.
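The minibatch strategy can be sketched directly: average exact OT costs over randomly sampled minibatch pairs. A minimal sketch using `scipy.optimize.linear_sum_assignment` for exact OT between uniform, equal-size minibatches (function names are illustrative):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def exact_ot_cost(x, y):
    """Exact OT cost between two equal-size point clouds with uniform
    weights: a linear assignment on the squared-distance matrix."""
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    r, c = linear_sum_assignment(C)
    return C[r, c].mean()

def minibatch_ot(X, Y, m=32, n_batches=20, seed=0):
    """Minibatch OT: average exact OT costs over random minibatch
    pairs. An unbiased estimator of its own expectation, but a biased
    proxy of the full OT cost, and not a distance (it is nonzero
    even between a distribution and itself)."""
    rng = np.random.default_rng(seed)
    costs = [exact_ot_cost(X[rng.choice(len(X), m, replace=False)],
                           Y[rng.choice(len(Y), m, replace=False)])
             for _ in range(n_batches)]
    return float(np.mean(costs))
```

Note that `minibatch_ot(X, X)` is strictly positive, which is exactly the "not a distance" limitation mentioned above.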
no code implementations • 13 Jul 2020 • Xuhong Li, Yves Grandvalet, Rémi Flamary, Nicolas Courty, Dejing Dou
We use optimal transport to quantify the match between two representations, yielding a distance that embeds some invariances inherent to the representation of deep networks.
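The matching step can be sketched with a generic entropic (Sinkhorn) OT cost between two sets of activations treated as uniform point clouds; this is a minimal from-scratch sketch, not the paper's exact distance, which further embeds invariances of deep representations:

```python
import numpy as np

def sinkhorn_ot(X, Y, reg=0.1, n_iter=200):
    """Entropic OT cost between two representation sets X (n x d) and
    Y (m x d) with uniform weights, via Sinkhorn iterations."""
    n, m = len(X), len(Y)
    C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    K = np.exp(-C / (reg * C.mean()))   # Gibbs kernel, scale-adjusted reg
    a, b = np.full(n, 1 / n), np.full(m, 1 / m)
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iter):            # alternate marginal scalings
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]    # transport plan
    return float((P * C).sum())
```

Because the cost depends only on pairwise distances between activation vectors, the resulting score is invariant to reordering the samples, one of the invariances OT brings for free.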
no code implementations • 24 Jun 2020 • Alain Rakotomamonjy, Rémi Flamary, Gilles Gasso, Joseph Salmon
Owing to their statistical properties, non-convex sparse regularizers have attracted much interest for estimating a sparse linear model from high dimensional data.
1 code implementation • 23 Jun 2020 • Rosanna Turrisi, Rémi Flamary, Alain Rakotomamonjy, Massimiliano Pontil
The problem of domain adaptation on an unlabeled target dataset using knowledge from multiple labelled source datasets is becoming increasingly important.
1 code implementation • 15 Jun 2020 • Alain Rakotomamonjy, Rémi Flamary, Gilles Gasso, Mokhtar Z. Alaya, Maxime Berar, Nicolas Courty
We address the problem of unsupervised domain adaptation under the setting of generalized target shift (joint class-conditional and label shifts).
1 code implementation • 10 Feb 2020 • Titouan Vayer, Romain Tavenard, Laetitia Chapel, Nicolas Courty, Rémi Flamary, Yann Soullard
Multivariate time series are ubiquitous objects in signal processing.
1 code implementation • NeurIPS 2020 • Ievgen Redko, Titouan Vayer, Rémi Flamary, Nicolas Courty
Optimal transport (OT) is a powerful geometric and probabilistic tool for finding correspondences and measuring similarity between two distributions.
3 code implementations • 9 Oct 2019 • Kilian Fatras, Younes Zine, Rémi Flamary, Rémi Gribonval, Nicolas Courty
Optimal transport distances are powerful tools to compare probability distributions and have found many applications in machine learning.
no code implementations • 28 Jun 2019 • Laurent Dragoni, Rémi Flamary, Karim Lounici, Patricia Reynaud-Bouret
Spike sorting is a fundamental preprocessing step in neuroscience, central to accessing simultaneous but distinct neuronal activities and thus to better understanding the animal, or even human, brain.
no code implementations • 24 May 2019 • Rémi Flamary, Karim Lounici, André Ferrari
This article investigates the quality of the estimator of the linear Monge mapping between distributions.
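Under Gaussian approximations, the linear Monge map has a well-known closed form: $T(x) = m_2 + A(x - m_1)$ with $A = \Sigma_1^{-1/2}(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2})^{1/2}\Sigma_1^{-1/2}$. A minimal plug-in estimator sketch (function name is illustrative):

```python
import numpy as np
from scipy.linalg import sqrtm

def linear_monge_map(X_src, X_tgt):
    """Plug-in estimate of the linear Monge map between Gaussian
    approximations N(m1, S1) -> N(m2, S2) of two samples:
    T(x) = m2 + A (x - m1), A = S1^{-1/2} (S1^{1/2} S2 S1^{1/2})^{1/2} S1^{-1/2}."""
    m1, m2 = X_src.mean(0), X_tgt.mean(0)
    S1, S2 = np.cov(X_src.T), np.cov(X_tgt.T)
    S1h = np.real(sqrtm(S1))
    S1h_inv = np.linalg.inv(S1h)
    A = S1h_inv @ np.real(sqrtm(S1h @ S2 @ S1h)) @ S1h_inv
    return lambda x: m2 + (x - m1) @ A.T
```

By construction the mapped sample exactly matches the target sample mean and covariance; the paper's question is how fast such a plug-in estimator converges to the population map.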
1 code implementation • NeurIPS 2019 • Titouan Vayer, Rémi Flamary, Romain Tavenard, Laetitia Chapel, Nicolas Courty
Recently used in various machine learning contexts, the Gromov-Wasserstein distance (GW) allows for comparing distributions whose supports do not necessarily lie in the same metric space.
1 code implementation • 8 Apr 2019 • Kilian Fatras, Bharath Bhushan Damodaran, Sylvain Lobry, Rémi Flamary, Devis Tuia, Nicolas Courty
Noisy labels often occur in vision datasets, especially when they are obtained from crowdsourcing or Web scraping.
1 code implementation • 7 Nov 2018 • Titouan Vayer, Laetitia Chapel, Rémi Flamary, Romain Tavenard, Nicolas Courty
Optimal transport theory has recently found many applications in machine learning thanks to its capacity for comparing various machine learning objects considered as distributions.
no code implementations • 2 Oct 2018 • Bharath Bhushan Damodaran, Rémi Flamary, Vivien Seguy, Nicolas Courty
The state-of-the-art performance of deep neural networks is conditioned on the availability of a large number of accurately labeled samples.
2 code implementations • 23 May 2018 • Titouan Vayer, Laetitia Chapel, Rémi Flamary, Romain Tavenard, Nicolas Courty
This work considers the problem of computing distances between structured objects such as undirected graphs, seen as probability distributions in a specific metric space.
Ranked #3 on Graph Classification on NCI1
4 code implementations • ECCV 2018 • Bharath Bhushan Damodaran, Benjamin Kellenberger, Rémi Flamary, Devis Tuia, Nicolas Courty
In computer vision, one is often confronted with problems of domain shifts, which occur when one applies a classifier trained on a source dataset to target data sharing similar characteristics (e.g. same classes), but also different latent data structures (e.g. different acquisition conditions).
Ranked #2 on Domain Adaptation on MNIST-to-MNIST-M
3 code implementations • 13 Mar 2018 • Ievgen Redko, Nicolas Courty, Rémi Flamary, Devis Tuia
In this paper, we propose to tackle the problem of reducing discrepancies between multiple domains, referred to as multi-source domain adaptation, and consider it under the target shift assumption: in all domains we aim to solve a classification problem with the same output classes, but with label proportions differing across them.
no code implementations • 1 Mar 2018 • Alain Rakotomamonjy, Abraham Traoré, Maxime Berar, Rémi Flamary, Nicolas Courty
This paper presents a distance-based discriminative framework for learning with probability distributions.
no code implementations • 30 Nov 2017 • Ibrahim El Khalil Harrane, Rémi Flamary, Cédric Richard
While these data may be processed in a centralized manner, it is often more suitable to consider distributed strategies such as diffusion as they are scalable and can handle large amounts of data by distributing tasks over networked agents.
2 code implementations • 7 Nov 2017 • Vivien Seguy, Bharath Bhushan Damodaran, Rémi Flamary, Nicolas Courty, Antoine Rolet, Mathieu Blondel
We prove two theoretical stability results of regularized OT which show that our estimations converge to the OT plan and Monge map between the underlying continuous measures.
1 code implementation • ICLR 2018 • Nicolas Courty, Rémi Flamary, Mélanie Ducoffe
Our goal is to alleviate this problem by providing an approximation mechanism that makes it possible to break its inherent complexity.
2 code implementations • NeurIPS 2017 • Nicolas Courty, Rémi Flamary, Amaury Habrard, Alain Rakotomamonjy
This paper deals with the unsupervised domain adaptation problem, where one wants to estimate a prediction function $f$ in a given target domain without any labeled sample by exploiting the knowledge available from a source domain where labels are known.
no code implementations • 10 Mar 2017 • Rita Ammanouil, André Ferrari, Rémi Flamary, Chiara Ferrari, David Mary
Image reconstruction algorithms for radio interferometry must scale to unprecedented terabyte image sizes.
2 code implementations • 14 Dec 2016 • Rémi Flamary
State of the art methods in astronomical image reconstruction rely on the resolution of a regularized or constrained optimization problem.
no code implementations • NeurIPS 2016 • Michaël Perrot, Nicolas Courty, Rémi Flamary, Amaury Habrard
Most of the computational approaches of Optimal Transport use the Kantorovich relaxation of the problem to learn a probabilistic coupling $\gamma$ but do not address the problem of learning the underlying transport map $T$ linked to the original Monge problem.
1 code implementation • NeurIPS 2016 • Rémi Flamary, Cédric Févotte, Nicolas Courty, Valentin Emiya
Many spectral unmixing methods rely on the non-negative decomposition of spectral data onto a dictionary of spectral templates.
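The baseline decomposition referred to here can be sketched as non-negative least squares: find non-negative abundances $a$ minimizing $\|Da - s\|_2$ for a spectrum $s$ and template dictionary $D$ (the paper builds on this setup with an OT data-fitting term that tolerates spectral shifts; the function name is illustrative):

```python
import numpy as np
from scipy.optimize import nnls

def unmix(spectrum, D):
    """Non-negative decomposition of a spectrum onto a dictionary D
    (columns = spectral templates): returns abundances a >= 0
    minimizing the l2 residual ||D a - spectrum||_2."""
    a, residual = nnls(D, spectrum)
    return a, residual
```

For a spectrum lying exactly in the non-negative cone of the templates, the abundances are recovered exactly with zero residual.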
1 code implementation • 29 Aug 2016 • Rémi Flamary, Marco Cuturi, Nicolas Courty, Alain Rakotomamonjy
Wasserstein Discriminant Analysis (WDA) is a new supervised method that can improve classification of high-dimensional data by computing a suitable linear map onto a lower dimensional subspace.
no code implementations • 23 Jun 2016 • Devis Tuia, Rémi Flamary, Nicolas Courty
In this paper, we tackle the question of discovering an effective set of spatial filters to solve hyperspectral classification problems.
no code implementations • 23 Jun 2016 • Rémi Flamary, Alain Rakotomamonjy, Gilles Gasso
As the number of samples and dimensionality of optimization problems related to statistics and machine learning explode, block coordinate descent algorithms have gained popularity since they reduce the original problem to several smaller ones.
no code implementations • 22 Oct 2015 • Alain Rakotomamonjy, Rémi Flamary, Nicolas Courty
The objective of this technical report is to provide additional results on the generalized conditional gradient methods introduced by Bredies et al. [BLM05].
no code implementations • 2 Jul 2015 • Nicolas Courty, Rémi Flamary, Devis Tuia, Alain Rakotomamonjy
Domain adaptation from one data space (or domain) to another is one of the most challenging tasks of modern data analytics.
no code implementations • 2 Jul 2015 • Léa Laporte, Rémi Flamary, Stephane Canu, Sébastien Déjean, Josiane Mothe
Feature selection in learning to rank has recently emerged as a crucial issue.
no code implementations • 2 Jul 2015 • André Ferrari, David Mary, Rémi Flamary, Cédric Richard
Current and future radio interferometric arrays such as LOFAR and SKA are characterized by a paradox.
no code implementations • 14 Mar 2014 • Rémi Flamary, Nisrine Jrad, Ronald Phlypo, Marco Congedo, Alain Rakotomamonjy
This framework is extended to the multi-task learning situation where several similar classification tasks related to different subjects are learned simultaneously.
no code implementations • Front. Neurosci., Sec. Neuroprosthetics 2012 • Rémi Flamary, Alain Rakotomamonjy
As a witness of the BCI community's increasing interest in such a problem, the fourth BCI Competition provides a dataset whose aim is to predict individual finger movements from ECoG signals.