no code implementations • 4 Apr 2024 • Mokhtar Z. Alaya, Alain Rakotomamonjy, Maxime Berar, Gilles Gasso
We particularly focus on the Gaussian smoothed sliced Wasserstein distance and prove that it converges with a rate $O(n^{-1/2})$.
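The Gaussian smoothed sliced Wasserstein distance can be estimated by Monte Carlo: project both samples onto random directions, perturb the 1D projections with Gaussian noise, and average the closed-form 1D Wasserstein distances. A minimal numpy sketch, where the function name, parameters, and defaults are illustrative rather than the paper's implementation:

```python
import numpy as np

def gaussian_smoothed_sliced_w2(X, Y, n_proj=50, sigma=0.5, rng=None):
    """Monte Carlo sketch of a Gaussian-smoothed sliced 2-Wasserstein
    distance between two equal-size point clouds X and Y.
    Illustrative only; not the paper's estimator."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)       # random direction on the sphere
        px = X @ theta + sigma * rng.normal(size=len(X))  # smoothed projections
        py = Y @ theta + sigma * rng.normal(size=len(Y))
        # 1D W2 between empirical measures = L2 distance of sorted samples
        total += np.mean((np.sort(px) - np.sort(py)) ** 2)
    return np.sqrt(total / n_proj)
```

Because of the added Gaussian noise, the estimate is strictly positive even for identical inputs; the $O(n^{-1/2})$ rate concerns how the empirical estimate approaches the population quantity as the sample size grows.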
no code implementations • 12 Dec 2023 • Marwa Kechaou, Mokhtar Z. Alaya, Romain Hérault, Gilles Gasso
Adversarial learning baselines for domain adaptation (DA) in semantic segmentation remain underexplored in the semi-supervised setting.

1 code implementation • 4 Jul 2023 • Guillaume Mahey, Laetitia Chapel, Gilles Gasso, Clément Bonet, Nicolas Courty
Wasserstein distance (WD) and the associated optimal transport plan have been proven useful in many applications where probability measures are at stake.
no code implementations • 15 Jun 2022 • Cyprien Ruffino, Rachel Blin, Samia Ainouz, Gilles Gasso, Romain Hérault, Fabrice Meriaudeau, Stéphane Canu
Polarimetric imaging, combined with deep learning, has shown improved performance on various tasks, including scene analysis.
no code implementations • 20 Oct 2021 • Alain Rakotomamonjy, Mokhtar Z. Alaya, Maxime Berar, Gilles Gasso
In this paper, we analyze the theoretical properties of this distance as well as those of generalized versions denoted as Gaussian smoothed sliced divergences.
1 code implementation • NeurIPS 2021 • Laetitia Chapel, Rémi Flamary, Haoran Wu, Cédric Févotte, Gilles Gasso
In particular, we consider majorization-minimization which leads in our setting to efficient multiplicative updates for a variety of penalties.
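Majorization-minimization with multiplicative updates is easiest to see on a classic instance: the Lee–Seung updates for Frobenius-norm NMF. The sketch below illustrates the same update style (each multiplicative step is an MM step that never increases the objective and preserves nonnegativity); it is not the paper's unbalanced-OT algorithm:

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=500, eps=1e-10, rng=0):
    """Lee-Seung multiplicative updates for min ||V - WH||_F^2, W,H >= 0.
    Each update is a majorization-minimization step: it minimizes a
    quadratic upper bound of the loss, so the objective is monotone."""
    g = np.random.default_rng(rng)
    m, n = V.shape
    W = g.random((m, r)) + eps
    H = g.random((r, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # multiplicative update for H
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # multiplicative update for W
    return W, H
```

The appeal in both settings is the same: no step size to tune, and the constraints (nonnegativity here, marginal penalties in the paper) are enforced for free by the multiplicative form.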
no code implementations • 4 Jun 2021 • Mokhtar Z. Alaya, Gilles Gasso, Maxime Berar, Alain Rakotomamonjy
We provide a theoretical analysis of this new divergence, called $\textit{heterogeneous Wasserstein discrepancy (HWD)}$, and we show that it preserves several interesting properties including rotation-invariance.
no code implementations • 29 May 2021 • Komi Midzodzi Pékpé, Djamel Zitouni, Gilles Gasso, Wajdi Dhifli, Benjamin C. Guinhouya
When applied to Brazil's cases, SEAIRD produced an excellent agreement with the data, with a coefficient of determination $R^2 \geq 90\%$.
no code implementations • NeurIPS 2020 • Laetitia Chapel, Mokhtar Z. Alaya, Gilles Gasso
Classical optimal transport problem seeks a transportation map that preserves the total mass between two probability distributions, requiring their masses to be equal.
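The mass-preservation constraint can be made concrete with the north-west corner rule, which constructs a feasible (not optimal) transport plan whose marginals match the two histograms exactly; a small numpy sketch, illustrative only:

```python
import numpy as np

def northwest_corner(a, b):
    """Build a feasible transport plan between histograms a and b of
    equal total mass. The plan's row sums equal a and its column sums
    equal b, illustrating the mass-preservation constraint of classical
    OT. This gives feasibility, not optimality."""
    a, b = a.astype(float).copy(), b.astype(float).copy()
    T = np.zeros((len(a), len(b)))
    i = j = 0
    while i < len(a) and j < len(b):
        m = min(a[i], b[j])      # ship as much mass as both cells allow
        T[i, j] = m
        a[i] -= m
        b[j] -= m
        if a[i] <= 1e-12:        # row i exhausted
            i += 1
        if b[j] <= 1e-12:        # column j exhausted
            j += 1
    return T
```

When the two measures have unequal masses, no such plan exists, which is precisely the limitation that partial and unbalanced OT formulations relax.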
no code implementations • 2 Oct 2020 • Marwa Kechaou, Romain Hérault, Mokhtar Z. Alaya, Gilles Gasso
We present a 2-step optimal transport approach that performs a mapping from a source distribution to a target distribution.
no code implementations • 24 Jun 2020 • Alain Rakotomamonjy, Rémi Flamary, Gilles Gasso, Joseph Salmon
Owing to their statistical properties, non-convex sparse regularizers have attracted much interest for estimating a sparse linear model from high dimensional data.
1 code implementation • 15 Jun 2020 • Alain Rakotomamonjy, Rémi Flamary, Gilles Gasso, Mokhtar Z. Alaya, Maxime Berar, Nicolas Courty
We address the problem of unsupervised domain adaptation under the setting of generalized target shift (joint class-conditional and label shifts).
3 code implementations • 10 Jun 2020 • Benjamin Deguerre, Clement Chatelain, Gilles Gasso
To gain efficiency, this paper proposes to exploit the compressed representation of images to perform object detection under constrained-resource conditions.
no code implementations • 11 May 2020 • Michel Olvera, Emmanuel Vincent, Romain Serizel, Gilles Gasso
Ambient sound scenes typically comprise multiple short events occurring on top of a somewhat stationary background.
no code implementations • 19 Feb 2020 • Mokhtar Z. Alaya, Maxime Bérar, Gilles Gasso, Alain Rakotomamonjy
Unlike the Gromov-Wasserstein (GW) distance, which compares pairwise distances of elements from each distribution, we consider a method that embeds the metric measure spaces in a common Euclidean space and computes an optimal transport (OT) on the embedded distributions.
3 code implementations • 19 Feb 2020 • Laetitia Chapel, Mokhtar Z. Alaya, Gilles Gasso
In this paper, we address the partial Wasserstein and Gromov-Wasserstein problems and propose exact algorithms to solve them.
no code implementations • 4 Feb 2020 • Cyprien Ruffino, Romain Hérault, Eric Laloy, Gilles Gasso
We investigate the influence of this regularization term on the quality of the generated images and the fulfillment of the given pixel constraints.
1 code implementation • 2 Nov 2019 • Cyprien Ruffino, Romain Hérault, Eric Laloy, Gilles Gasso
In this paper, we study the effectiveness of conditioning GANs by adding an explicit regularization term to enforce pixel-wise conditions when very few pixel values are provided.
1 code implementation • NeurIPS 2019 • Mokhtar Z. Alaya, Maxime Bérar, Gilles Gasso, Alain Rakotomamonjy
We introduce in this paper a novel strategy for efficiently approximating the Sinkhorn distance between two discrete measures.
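For reference, the baseline such approximations target is the standard Sinkhorn-Knopp iteration for entropy-regularized OT, which alternately rescales the rows and columns of a Gibbs kernel. A minimal numpy sketch (not the paper's accelerated scheme):

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iter=500):
    """Standard Sinkhorn-Knopp iterations for entropy-regularized OT
    between histograms a and b with cost matrix C.
    Returns the regularized transport cost and the plan."""
    K = np.exp(-C / reg)             # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)            # rescale to match column marginals
        u = a / (K @ v)              # rescale to match row marginals
    T = u[:, None] * K * v[None, :]  # transport plan
    return np.sum(T * C), T
```

Each iteration costs two matrix-vector products, which is what makes Sinkhorn-type schemes (and their approximations) attractive at scale compared with exact LP solvers.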
no code implementations • 15 May 2019 • Cyprien Ruffino, Romain Hérault, Eric Laloy, Gilles Gasso
In combination with convolutional (for the discriminator) and de-convolutional (for the generator) layers, they are particularly suitable for image generation, especially of natural scenes.
2 code implementations • 16 Apr 2019 • Benjamin Deguerre, Clément Chatelain, Gilles Gasso
Object detection in still images has drawn a lot of attention over past few years, and with the advent of Deep Learning impressive performances have been achieved with numerous industrial applications.
no code implementations • 16 Feb 2019 • Alain Rakotomamonjy, Gilles Gasso, Joseph Salmon
Leveraging the convexity of the Lasso problem, screening rules help accelerate solvers by discarding irrelevant variables during the optimization process.
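A simple instance of such a rule is the basic SAFE test of El Ghaoui et al., which certifies that certain coefficients are exactly zero at the solution, so the solver can drop those features up front. The sketch below assumes the Lasso formulation $\min_b \frac{1}{2}\|y - Xb\|^2 + \lambda\|b\|_1$ and is an illustration of the screening idea, not the rule proposed in the paper:

```python
import numpy as np

def safe_screen(X, y, lam):
    """Basic SAFE screening rule for the Lasso
    min 1/2 ||y - Xb||^2 + lam ||b||_1.
    Returns a boolean mask: True means the feature is guaranteed to
    have a zero coefficient and can be discarded before solving."""
    lam_max = np.max(np.abs(X.T @ y))             # smallest lam giving b = 0
    radius = np.linalg.norm(y) * (lam_max - lam) / lam_max
    scores = np.abs(X.T @ y)                       # feature correlations
    return scores < lam - np.linalg.norm(X, axis=0) * radius
```

The closer `lam` is to `lam_max`, the more features the test discards, which is why screening pays off most along the early part of a regularization path.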
1 code implementation • 21 Dec 2018 • Eric Laloy, Niklas Linde, Cyprien Ruffino, Romain Hérault, Gilles Gasso, Diedrik Jacques
Global probabilistic inversion within the latent space learned by Generative Adversarial Networks (GANs) has recently been demonstrated (Laloy et al., 2018).
Geophysics
no code implementations • 12 Dec 2018 • Imad Rida, Romain Hérault, Gilles Gasso
Motivated by this need of a principled framework across domain applications for machine listening, we propose a generic and data-driven representation learning approach.
no code implementations • 23 Jun 2016 • Rémi Flamary, Alain Rakotomamonjy, Gilles Gasso
As the number of samples and the dimensionality of optimization problems in statistics and machine learning explode, block coordinate descent algorithms have gained popularity, since they reduce the original problem to several smaller ones.
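Coordinate descent (blocks of size one) is easiest to illustrate on ridge regression, where each coordinate subproblem has a closed form. A minimal numpy sketch of the cyclic variant, with names chosen for illustration:

```python
import numpy as np

def ridge_cd(X, y, alpha=1.0, n_iter=200):
    """Cyclic coordinate descent for ridge regression
    min 1/2 ||y - Xb||^2 + (alpha/2) ||b||^2.
    Each pass minimizes the objective exactly in one coordinate
    while holding the others fixed."""
    n, d = X.shape
    b = np.zeros(d)
    r = y - X @ b                      # residual, kept up to date
    col_sq = (X ** 2).sum(axis=0)      # precomputed column norms
    for _ in range(n_iter):
        for j in range(d):
            rj = r + X[:, j] * b[j]    # residual with feature j removed
            new = X[:, j] @ rj / (col_sq[j] + alpha)  # 1D closed form
            r = rj - X[:, j] * new     # restore residual with new b_j
            b[j] = new
    return b
```

The same residual-caching trick is what keeps each coordinate update at O(n) cost, so a full pass matches the cost of one gradient step while often making more progress.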
no code implementations • 20 Aug 2015 • Alain Rakotomamonjy, Gilles Gasso
This paper addresses the problem of audio scene classification and contributes to the state of the art by proposing a novel feature.
no code implementations • 2 Jul 2015 • Alain Rakotomamonjy, Remi Flamary, Gilles Gasso
We introduce a novel algorithm for solving learning problems where both the loss function and the regularizer are non-convex but belong to the class of difference of convex (DC) functions.
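The DC algorithm (DCA) underlying this class of methods alternates between linearizing the concave part $-h$ at the current iterate and solving the resulting convex surrogate exactly. A toy sketch on $f(x) = x^2 - |x|$ (the example objective is mine, chosen so every step has a closed form; it is not the paper's learning problem):

```python
def dca(x0, n_iter=20):
    """DCA on the toy DC objective f(x) = x^2 - |x|, written as
    g(x) - h(x) with g(x) = x^2 and h(x) = |x| both convex.
    Each iteration picks a subgradient s of h at x_k and minimizes
    the convex surrogate g(x) - s*x, whose argmin is s/2."""
    x = x0
    for _ in range(n_iter):
        s = 1.0 if x >= 0 else -1.0    # subgradient of |x| at x_k
        x = s / 2.0                    # closed-form argmin of x^2 - s*x
    return x
```

Here DCA converges in one step to a global minimizer ($x = \pm 1/2$, $f = -1/4$); in general it only guarantees monotone decrease to a critical point of the DC objective.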