1 code implementation • 25 May 2023 • Rémi Bardenet, Michaël Fanuel, Alexandre Feller
Most applications require sampling from a DPP, and given their quantum origin, it is natural to wonder whether sampling a DPP on a quantum computer is easier than on a classical one.
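For context, the standard classical route is the spectral sampling algorithm for DPPs (Hough et al.). A minimal NumPy sketch, assuming a symmetric marginal kernel with eigenvalues in [0, 1] — this is the classical baseline, not the quantum approach studied in the paper:

```python
import numpy as np

def sample_dpp(K, rng=None):
    """Sample a subset from the DPP with marginal kernel K
    (symmetric, eigenvalues in [0, 1]) via the spectral algorithm."""
    rng = np.random.default_rng(rng)
    eigvals, eigvecs = np.linalg.eigh(K)
    # Phase 1: keep eigenvector i independently with probability lambda_i.
    V = eigvecs[:, rng.random(len(eigvals)) < eigvals]
    sample = []
    # Phase 2: pick items with probability proportional to squared row norms,
    # then project the remaining columns so the chosen coordinate is zeroed.
    while V.shape[1] > 0:
        probs = np.sum(V**2, axis=1)
        i = rng.choice(len(probs), p=probs / probs.sum())
        sample.append(int(i))
        j = np.argmax(np.abs(V[i, :]))       # a column with V[i, j] != 0
        Vj = V[:, j].copy()
        V = np.delete(V, j, axis=1)
        V -= np.outer(Vj, V[i, :] / Vj[i])   # zero out row i
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)           # re-orthonormalize
    return sorted(sample)
```

The eigendecomposition dominates the cost, which is part of what motivates asking whether a quantum device could do better.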
1 code implementation • 31 Aug 2022 • Michaël Fanuel, Rémi Bardenet
We provide statistical guarantees for a choice of natural estimators of the connection Laplacian, and investigate two practical applications of our sparsifiers: ranking with angular synchronization and graph-based semi-supervised learning.
1 code implementation • 2 Dec 2021 • Michaël Fanuel, Hemant Tyagi
We consider a fixed design setting where the modulo samples are given on a regular grid.
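The measurement model can be illustrated in a few lines. This is an assumed toy setup (the ground-truth function and noise level are invented for illustration), not the paper's estimator: a smooth function is observed on a regular grid only modulo 1.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = np.linspace(0.0, 1.0, n)           # fixed regular grid (fixed design)
f = 4.0 * np.sin(2.0 * np.pi * x)      # smooth ground-truth function
noise = 0.05 * rng.standard_normal(n)
y_mod = np.mod(f + noise, 1.0)         # observed modulo samples, in [0, 1)
```

Recovering `f` from `y_mod` requires unwrapping the folded values, which is where the statistical analysis comes in.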
1 code implementation • NeurIPS 2021 • Michaël Fanuel, Rémi Bardenet
Determinantal point processes (DPPs) are statistical models for repulsive point patterns.
no code implementations • 28 May 2021 • Joachim Schreurs, Michaël Fanuel, Johan A. K. Suykens
Determinantal point processes (DPPs) are well known models for diverse subset selection problems, including recommendation tasks, document summarization and image search.
no code implementations • 6 Apr 2021 • Joachim Schreurs, Hannes De Meulemeester, Michaël Fanuel, Bart De Moor, Johan A. K. Suykens
A generative model may overlook underrepresented modes that are less frequent in the empirical data distribution.
no code implementations • 13 Nov 2020 • Michaël Fanuel, Joachim Schreurs, Johan A. K. Suykens
Semi-parametric regression models are used in several applications that require comprehensibility without sacrificing accuracy.
no code implementations • 28 Sep 2020 • Hannes De Meulemeester, Joachim Schreurs, Michaël Fanuel, Bart De Moor, Johan Suykens
However, under certain circumstances, the training of GANs can lead to mode collapse or mode dropping, i.e. the generative model being unable to sample from the entire probability distribution.
1 code implementation • 10 Sep 2020 • Michaël Fanuel, Hemant Tyagi
The estimates of the samples $f(x_i)$ can be subsequently utilized to construct an estimate of the function $f$, with the aforementioned uniform error rate.
no code implementations • 24 Jun 2020 • Joachim Schreurs, Michaël Fanuel, Johan A. K. Suykens
By using the framework of Determinantal Point Processes (DPPs), some theoretical results concerning the interplay between diversity and regularization can be obtained.
no code implementations • 16 Jun 2020 • Hannes De Meulemeester, Joachim Schreurs, Michaël Fanuel, Bart De Moor, Johan A. K. Suykens
However, under certain circumstances, the training of GANs can lead to mode collapse or mode dropping, i.e. the generative model being unable to sample from the entire probability distribution.
no code implementations • 20 Feb 2020 • Michaël Fanuel, Joachim Schreurs, Johan A. K. Suykens
The Nyström approximation -- based on a subset of landmarks -- gives a low-rank approximation of the kernel matrix, and is known to provide a form of implicit regularization.
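A minimal sketch of the Nyström approximation, assuming an RBF kernel and a given landmark subset (choosing the landmarks well is the subject of the paper; here they are simply passed in):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Squared exponential kernel matrix between rows of X and Y."""
    d2 = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def nystrom(X, landmarks, gamma=1.0):
    """Low-rank Nystrom approximation K ~= C W^+ C^T from a landmark subset."""
    C = rbf_kernel(X, X[landmarks], gamma)  # n x m cross-kernel block
    W = C[landmarks, :]                     # m x m kernel among the landmarks
    return C @ np.linalg.pinv(W) @ C.T
```

When the landmarks cover the data well, the n x n kernel matrix never needs to be formed in full; with all points as landmarks the approximation is exact.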
1 code implementation • 5 Feb 2020 • Henri De Plaen, Michaël Fanuel, Johan A. K. Suykens
In the context of kernel methods, the similarity between data points is encoded by the kernel function, which is often defined in terms of the Euclidean distance; a common example is the squared exponential kernel.
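A small sketch of such a distance-based kernel. The Euclidean distance is used below for concreteness, but another metric (e.g. a Wasserstein distance between distributions) could in principle be substituted:

```python
import numpy as np

def exp_kernel(D, sigma=1.0):
    """Exponential kernel built from a pairwise distance matrix D.
    D is Euclidean here, but another metric could be plugged in."""
    return np.exp(-D**2 / (2.0 * sigma**2))

X = np.random.default_rng(0).standard_normal((5, 3))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # Euclidean distances
K = exp_kernel(D)
```

By construction the resulting matrix is symmetric with unit diagonal; whether it is positive semi-definite depends on the metric used.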
no code implementations • 29 May 2019 • Michaël Fanuel, Joachim Schreurs, Johan A. K. Suykens
In this context, we propose a deterministic and a randomized adaptive algorithm for selecting landmark points within a training data set.
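As an illustration of deterministic landmark selection, a pivoted-Cholesky-style greedy heuristic — a common baseline, not necessarily the paper's exact algorithm — picks at each step the point with the largest residual diagonal of the kernel matrix:

```python
import numpy as np

def greedy_landmarks(K, m):
    """Pick m landmarks greedily by largest residual kernel diagonal
    (pivoted-Cholesky-style heuristic; illustrative baseline)."""
    n = K.shape[0]
    d = np.diag(K).astype(float)   # residual diagonal (copies the view)
    L = np.zeros((n, m))
    selected = []
    for t in range(m):
        i = int(np.argmax(d))
        selected.append(i)
        # Rank-one Cholesky update of the residual diagonal.
        col = (K[:, i] - L[:, :t] @ L[i, :t]) / np.sqrt(d[i])
        L[:, t] = col
        d -= col**2
    return selected
```

Each step costs O(n) on top of one kernel column evaluation, so the full kernel matrix need not be stored.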
1 code implementation • 20 Nov 2017 • Michaël Fanuel, Antoine Aspeel, Jean-Charles Delvenne, Johan A. K. Suykens
In machine learning or statistics, it is often desirable to reduce the dimensionality of a sample of data points in a high dimensional space $\mathbb{R}^d$.
no code implementations • 21 Dec 2016 • Carlos M. Alaíz, Michaël Fanuel, Johan A. K. Suykens
A graph-based classification method is proposed for semi-supervised learning in the case of Euclidean data and for classification in the case of graph data.
no code implementations • 21 Oct 2016 • Carlos M. Alaíz, Michaël Fanuel, Johan A. K. Suykens
In this paper, Kernel PCA is reinterpreted as the solution to a convex optimization problem.
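For reference, the classical eigendecomposition view of Kernel PCA that the convex reformulation revisits can be sketched as follows, assuming a precomputed kernel matrix:

```python
import numpy as np

def kernel_pca(K, n_components=2):
    """Classical Kernel PCA: eigendecomposition of the centered kernel matrix."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    Kc = H @ K @ H
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Component scores: eigenvectors scaled by sqrt(eigenvalue).
    return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0.0))
```

Because the kernel matrix is centered first, the scores along each component sum to zero, matching ordinary PCA when a linear kernel is used.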