1 code implementation • 5 Nov 2024 • Imen Ayadi, Florent Bouchard, Frédéric Pascal
This paper deals with Elliptical Wishart distributions - which generalize the Wishart distribution - in the context of signal processing and machine learning.
1 code implementation • 10 May 2024 • Florent Bouchard, Ammar Mian, Malik Tiomoko, Guillaume Ginolhac, Frédéric Pascal
In this study, we consider the realm of covariance matrices in machine learning, particularly focusing on computing Fréchet means on the manifold of symmetric positive definite matrices, commonly referred to as Karcher or geometric means.
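As a reference point for this entry, the Karcher/Fréchet mean under the affine-invariant metric can be computed by a standard Riemannian gradient-descent fixed-point iteration. The sketch below is the textbook algorithm, not the specific method studied in the paper:

```python
import numpy as np
from scipy.linalg import sqrtm, logm, expm, inv

def karcher_mean(mats, n_iter=50, tol=1e-10):
    """Karcher (Fréchet) mean of SPD matrices under the affine-invariant
    metric, via Riemannian gradient descent. Generic textbook sketch."""
    G = np.mean(mats, axis=0)  # Euclidean mean as initialization
    for _ in range(n_iter):
        G_sqrt = sqrtm(G)
        G_isqrt = inv(G_sqrt)
        # Riemannian gradient: average of logs in the tangent space at G
        T = np.mean([logm(G_isqrt @ M @ G_isqrt) for M in mats], axis=0)
        G = G_sqrt @ expm(T) @ G_sqrt  # retract back to the manifold
        if np.linalg.norm(T) < tol:
            break
    return G
```

For commuting matrices this reduces to the matrix geometric mean, e.g. the mean of `diag(1,1)` and `diag(4,4)` is `diag(2,2)`.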
no code implementations • 16 Jan 2024 • Jasin Machkour, Arnaud Breloy, Michael Muma, Daniel P. Palomar, Frédéric Pascal
Sparse principal component analysis (PCA) aims at mapping large dimensional data to a linear subspace of lower dimension.
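To illustrate the problem this entry addresses, here is a generic sparse-PCA heuristic (soft-thresholded power iteration) for the leading component; it is an assumption-laden sketch, not the selection method proposed in the paper:

```python
import numpy as np

def sparse_pc(X, lam, n_iter=200):
    """Leading sparse principal component via soft-thresholded power
    iteration. Generic sparse-PCA heuristic, not the paper's method."""
    S = np.cov(X, rowvar=False)
    v = np.linalg.eigh(S)[1][:, -1]  # start from leading eigenvector
    for _ in range(n_iter):
        w = S @ v
        w = np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)  # soft-threshold
        nrm = np.linalg.norm(w)
        if nrm == 0:
            break
        v = w / nrm
    return v
```

On data where only a few variables carry signal, the thresholding zeroes out the noise coordinates, yielding an interpretable loading vector.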
no code implementations • 12 Dec 2023 • Nora Ouzir, Frédéric Pascal, Jean-Christophe Pesquet
In robust estimation, imposing classical constraints on the precision matrix, such as sparsity, has been limited by the non-convexity of the resulting cost function.
no code implementations • 30 Nov 2023 • Frédéric Chazal, Laure Ferraris, Pablo Groisman, Matthieu Jonckheere, Frédéric Pascal, Facundo Sapienza
The Fermat distance has recently been established as a useful tool for machine learning tasks when a natural distance is not directly available to the practitioner, or to improve on results given by Euclidean distances by exploiting the geometrical and statistical properties of the dataset.
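The sample Fermat distance mentioned in this entry is a shortest-path distance over the data with Euclidean edge lengths raised to a power alpha; a direct (O(n^3), small-n) sketch of that standard estimator:

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def sample_fermat_distance(X, alpha=2.0):
    """Sample Fermat distance: shortest-path distance over the complete
    graph on the data points, with edge weights ||x_i - x_j||^alpha
    (alpha >= 1). Direct dense sketch for small datasets."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return shortest_path(D ** alpha, method='FW', directed=False)
```

For alpha > 1 the distance favors paths through dense regions of the sample: for three collinear points 0, 1, 2 with alpha = 2, the path through the middle point costs 1 + 1 = 2, beating the direct edge of cost 4.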
1 code implementation • 28 Jan 2022 • Florian Mouret, Alexandre Hippert-Ferrer, Frédéric Pascal, Jean-Yves Tourneret
To overcome this issue, a new EM algorithm is investigated for mixtures of elliptical distributions that is able to handle potentially missing data.
1 code implementation • 9 Jan 2022 • Pierre Houdouin, Frédéric Pascal, Matthieu Jonckheere, Andrew Wang
Linear and Quadratic Discriminant Analysis are well-known classical methods but can suffer heavily under non-Gaussian distributions and/or contaminated datasets, mainly because the underlying Gaussian assumption is not robust.
no code implementations • 1 Nov 2021 • Malik Tiomoko, Romain Couillet, Frédéric Pascal
The article proposes and theoretically analyses a computationally efficient multi-task learning (MTL) extension of popular principal component analysis (PCA)-based supervised learning schemes (Barshan et al., 2011; Bair et al., 2006).
no code implementations • 19 Oct 2021 • Alexandre Hippert-Ferrer, Ammar Mian, Florent Bouchard, Frédéric Pascal
This paper proposes a strategy to handle missing data for the classification of electroencephalograms using covariance matrices.
no code implementations • 27 Feb 2020 • Stefano Fortunati, Alexandre Renaux, Frédéric Pascal
This paper presents a simulation-based analysis of the main properties of a new $R$-estimator of shape matrices for Complex Elliptically Symmetric (CES) distributed observations.
3 code implementations • 6 Feb 2020 • Stefano Fortunati, Alexandre Renaux, Frédéric Pascal
The class of elliptical distributions can be seen as a semiparametric model where the finite-dimensional vector of interest is given by the location vector and by the (vectorized) covariance/scatter matrix, while the density generator represents an infinite-dimensional nuisance function.
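For context on shape-matrix estimation in this semiparametric elliptical setting, a classical baseline is Tyler's M-estimator of scatter; the sketch below implements that baseline, not the $R$-estimator proposed in the paper, and assumes centered data:

```python
import numpy as np

def tyler_estimator(X, n_iter=100, tol=1e-8):
    """Tyler's M-estimator of the shape (scatter) matrix for centered
    elliptical data: a classical robust baseline in the CES setting,
    normalized so that trace(V) = p (shape, not full covariance)."""
    n, p = X.shape
    V = np.eye(p)
    for _ in range(n_iter):
        Vi = np.linalg.inv(V)
        # per-sample weights downweight outlying observations
        w = p / np.einsum('ij,jk,ik->i', X, Vi, X)
        V_new = (X * w[:, None]).T @ X / n
        V_new *= p / np.trace(V_new)  # fix the scale ambiguity
        if np.linalg.norm(V_new - V) < tol:
            return V_new
        V = V_new
    return V
```

Because only the shape is identifiable (the density generator absorbs the scale), the estimate is trace-normalized; it remains consistent across the whole elliptical family.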
2 code implementations • 2 Jul 2019 • Violeta Roizman, Matthieu Jonckheere, Frédéric Pascal
Though very popular, it is well known that the EM algorithm for GMMs suffers from non-Gaussian distribution shapes, outliers and high dimensionality.
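For reference, the classical EM-for-GMM baseline whose weaknesses this entry targets can be sketched in a few lines; this is the standard algorithm with a naive farthest-point initialization, not the robust variant proposed in the paper:

```python
import numpy as np

def em_gmm(X, K, n_iter=100):
    """Minimal EM for a Gaussian mixture model: the classical baseline.
    Robust variants replace these Gaussian-based E/M-step updates."""
    n, d = X.shape
    # naive farthest-point initialization of the means
    mu = [X[0]]
    for _ in range(K - 1):
        dist = np.min([np.linalg.norm(X - m, axis=1) for m in mu], axis=0)
        mu.append(X[np.argmax(dist)])
    mu = np.array(mu)
    Sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)])
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: posterior responsibilities (log-domain for stability)
        R = np.empty((n, K))
        for k in range(K):
            diff = X - mu[k]
            quad = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma[k]), diff)
            logdet = np.linalg.slogdet(Sigma[k])[1]
            R[:, k] = np.log(pi[k]) - 0.5 * (quad + logdet + d * np.log(2 * np.pi))
        R = np.exp(R - R.max(axis=1, keepdims=True))
        R /= R.sum(axis=1, keepdims=True)
        # M-step: update weights, means, covariances
        Nk = R.sum(axis=0)
        pi = Nk / n
        mu = (R.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            Sigma[k] = (R[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(d)
    return pi, mu, Sigma, R.argmax(axis=1)
```

On well-separated Gaussian clusters this recovers the partition; its sensitivity to heavy tails and outliers is exactly what motivates the robust clustering approach of the paper.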