Search Results for author: Rémi Bardenet

Found 18 papers, 11 papers with code

On sampling determinantal and Pfaffian point processes on a quantum computer

1 code implementation 25 May 2023 Rémi Bardenet, Michaël Fanuel, Alexandre Feller

Most applications require sampling from a DPP, and given their quantum origin, it is natural to wonder whether sampling a DPP on a quantum computer is easier than on a classical one.

Point Processes

Sparsification of the regularized magnetic Laplacian with multi-type spanning forests

1 code implementation 31 Aug 2022 Michaël Fanuel, Rémi Bardenet

In the context of large and dense graphs, we study here sparsifiers of the magnetic Laplacian, i.e., spectral approximations based on subgraphs with few edges.

A covariant, discrete time-frequency representation tailored for zero-based signal detection

1 code implementation 8 Feb 2022 Barbara Pascal, Rémi Bardenet

Recent work in time-frequency analysis proposed to switch the focus from the maxima of the spectrogram to its zeros, which, for signals corrupted by Gaussian noise, form a random point pattern with a very stable structure; modern spatial statistics tools leverage this structure to perform component disentanglement and signal detection.

Disentanglement
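As a rough, hypothetical illustration of the zeros-of-the-spectrogram idea: the sketch below computes a plain Gaussian-windowed STFT of white noise with SciPy and flags local minima of its magnitude as candidate zeros. It is not the covariant discrete transform introduced in the paper, and the window length, overlap and minimum search are arbitrary choices.

```python
# Naive illustration of spectrogram-zero detection on white noise,
# via a Gaussian-windowed STFT and a local-minimum search.
# This is NOT the covariant discrete transform of the paper.
import numpy as np
from scipy.signal import stft, windows
from scipy.ndimage import minimum_filter

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)                       # white Gaussian noise

nperseg = 256
win = windows.gaussian(nperseg, std=nperseg / 8)    # Gaussian analysis window
_, _, Z = stft(x, fs=1.0, window=win, nperseg=nperseg, noverlap=nperseg - 8)
S = np.abs(Z)

# Zeros of the underlying spectrogram appear as local minima of the
# discretized magnitude; collect their (frequency, time) indices.
is_min = S == minimum_filter(S, size=3)
freq_idx, time_idx = np.nonzero(is_min)
print(f"{len(freq_idx)} candidate zeros detected")
```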

On proportional volume sampling for experimental design in general spaces

no code implementations 9 Nov 2020 Arnaud Poinas, Rémi Bardenet

Optimal design for linear regression is a fundamental task in statistics.

Computation

Learning from DPPs via Sampling: Beyond HKPV and symmetry

no code implementations 8 Jul 2020 Rémi Bardenet, Subhroshekhar Ghosh

Our approach is scalable and applies to very general DPPs, beyond traditional symmetric kernels.

feature selection Point Processes

Kernel interpolation with continuous volume sampling

no code implementations ICML 2020 Ayoub Belhadji, Rémi Bardenet, Pierre Chainais

A fundamental task in kernel methods is to pick nodes and weights, so as to approximate a given function from an RKHS by the weighted sum of kernel translates located at the nodes.

Density Estimation Point Processes
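To make the task in the snippet concrete, here is a minimal kernel interpolation sketch under assumptions of my own (Gaussian kernel, bandwidth 0.1, equispaced nodes): the weights solve the Gram system, and the approximant is the weighted sum of kernel translates. The paper's contribution is to choose the nodes by continuous volume sampling rather than on a grid.

```python
# Kernel interpolation sketch: approximate f by a weighted sum of kernel
# translates, f_hat(x) = sum_i w_i k(x_i, x), with weights solving the
# Gram system K(X, X) w = f(X).  Nodes are a plain grid here; the paper
# draws them by continuous volume sampling instead.
import numpy as np

sigma = 0.1                                            # assumed Gaussian bandwidth
k = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma**2))

f = lambda x: np.sin(2 * np.pi * x)                    # toy target function
nodes = np.linspace(0, 1, 15)
w, *_ = np.linalg.lstsq(k(nodes, nodes), f(nodes), rcond=None)

xs = np.linspace(0, 1, 200)
f_hat = k(xs, nodes) @ w                               # weighted sum of kernel translates
print("max interpolation error:", np.abs(f_hat - f(xs)).max())
```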

On two ways to use determinantal point processes for Monte Carlo integration

1 code implementation NeurIPS 2019 Guillaume Gautier, Rémi Bardenet, Michal Valko

In the absence of DPP machinery to derive an efficient sampler and analyze their estimator, the idea of Monte Carlo integration with DPPs was stored in the cellar of numerical integration.

Numerical Integration Point Processes

Kernel quadrature with DPPs

1 code implementation NeurIPS 2019 Ayoub Belhadji, Rémi Bardenet, Pierre Chainais

We study quadrature rules for functions from an RKHS, using nodes sampled from a determinantal point process (DPP).
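For context, a generic kernel quadrature sketch, with assumptions of my own (Gaussian kernel, uniform reference measure on [0, 1], equispaced nodes): the weights solve the Gram system against the kernel mean embedding. The paper's point is precisely to sample the nodes from a DPP tied to the RKHS, which these few lines do not do.

```python
# Generic kernel quadrature: given nodes x_1..x_N, weights solve
#   K(X, X) w = z,   z_i = \int_0^1 k(x_i, y) dy  (kernel mean embedding).
# Nodes are equispaced here; the paper samples them from a DPP.
import numpy as np
from scipy.stats import norm

sigma = 0.1                                            # assumed Gaussian bandwidth
k = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma**2))

def embedding(x):
    # closed form of \int_0^1 exp(-(x - y)^2 / (2 sigma^2)) dy
    return sigma * np.sqrt(2 * np.pi) * (norm.cdf((1 - x) / sigma) - norm.cdf(-x / sigma))

nodes = np.linspace(0.025, 0.975, 20)
w, *_ = np.linalg.lstsq(k(nodes, nodes), embedding(nodes), rcond=None)

f = lambda x: np.sin(2 * np.pi * x) + x**2             # toy integrand, exact integral 1/3
print("quadrature:", w @ f(nodes), " exact:", 1 / 3)
```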

A determinantal point process for column subset selection

no code implementations 23 Dec 2018 Ayoub Belhadji, Rémi Bardenet, Pierre Chainais

We give bounds on the ratio of the expected approximation error for this DPP over the optimal error of PCA.

Dimensionality Reduction feature selection
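In standard column subset selection notation (which I assume here; constants and exact statements are in the paper), the ratio referred to in the snippet is

```latex
% S: random column subset drawn from the DPP
% Pi_S: orthogonal projection onto the span of the selected columns
% X_k: best rank-k approximation of X (truncated SVD / PCA)
\[
  \frac{\mathbb{E}\,\lVert X - \Pi_S X \rVert_{\mathrm{F}}^2}
       {\lVert X - X_k \rVert_{\mathrm{F}}^2},
\]
% i.e., the expected reconstruction error of the DPP-selected columns
% relative to the optimal rank-k error.
```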

DPPy: Sampling DPPs with Python

2 code implementations 19 Sep 2018 Guillaume Gautier, Guillermo Polito, Rémi Bardenet, Michal Valko

Determinantal point processes (DPPs) are specific probability distributions over clouds of points that are used as models and computational tools across physics, probability, statistics, and more recently machine learning.

Point Processes
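A minimal usage sketch of DPPy with a likelihood kernel; the names FiniteDPP, sample_exact and list_of_samples follow DPPy's documentation as I recall it and may differ across versions, and the random feature matrix is made up for the example.

```python
# Minimal DPPy sketch: build a likelihood kernel L and draw one exact
# sample.  Class/method names follow the DPPy docs and may differ
# slightly between versions.
import numpy as np
from dppy.finite_dpps import FiniteDPP

rng = np.random.default_rng(0)
Phi = rng.standard_normal((30, 10))      # 30 items, 10 features (made up)
L = Phi @ Phi.T                          # likelihood kernel: P(S) ∝ det(L_S)

dpp = FiniteDPP('likelihood', **{'L': L})
dpp.sample_exact()                       # spectral (HKPV-type) exact sampler
print(dpp.list_of_samples[-1])           # indices of the sampled subset
```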

Time-frequency transforms of white noises and Gaussian analytic functions

1 code implementation 30 Jul 2018 Rémi Bardenet, Adrien Hardy

Finally, we provide quantitative estimates concerning the finite-dimensional approximations of these white noises, which is of practical interest when it comes to implementing signal processing algorithms based on GAFs.

Probability Classical Analysis and ODEs Methodology
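The paper relates the Gaussian-windowed STFT of white noise to the planar Gaussian analytic function, and the snippet mentions finite-dimensional approximations; the sketch below samples such a truncation of the planar GAF. The truncation order and evaluation grid are arbitrary choices of mine.

```python
# Truncated planar Gaussian analytic function
#   GAF_N(z) = sum_{n=0}^{N-1} a_n z^n / sqrt(n!),  a_n i.i.d. complex N(0, 1).
# Its zeros form the repulsive point pattern that the paper connects to
# the zeros of the spectrogram of white noise.
import numpy as np
from scipy.special import factorial

rng = np.random.default_rng(0)
N = 80                                               # truncation order (arbitrary)
a = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

def gaf(z):
    z = np.asarray(z, dtype=complex)
    n = np.arange(N)
    return (z[..., None] ** n / np.sqrt(factorial(n)) * a).sum(axis=-1)

xs = np.linspace(-3, 3, 120)
grid = xs[None, :] + 1j * xs[:, None]                # small patch of the plane
print("min |GAF| on the grid:", np.abs(gaf(grid)).min())
```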

Zonotope hit-and-run for efficient sampling from projection DPPs

1 code implementation ICML 2017 Guillaume Gautier, Rémi Bardenet, Michal Valko

Previous theoretical results yield a fast mixing time of our chain when targeting a distribution that is close to a projection DPP, but not a DPP in general.

Point Processes Recommendation Systems

Monte Carlo with Determinantal Point Processes

1 code implementation 2 May 2016 Rémi Bardenet, Adrien Hardy

We show that repulsive random variables can yield Monte Carlo methods with faster convergence rates than the typical $N^{-1/2}$, where $N$ is the number of integrand evaluations.

Probability Classical Analysis and ODEs Computation Methodology
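As a reminder of the estimator at the heart of this line of work (stated in generic notation that may differ from the published version): for a projection DPP with kernel $K_N$ and reference measure $\mu$, importance weighting by the diagonal of the kernel gives an unbiased estimator, and for the multivariate orthogonal polynomial ensembles studied in the paper the fluctuations shrink faster than in i.i.d. Monte Carlo.

```latex
% (x_1, ..., x_N): projection DPP with kernel K_N w.r.t. mu
\[
  \widehat{I}_N(f) \;=\; \sum_{i=1}^{N} \frac{f(x_i)}{K_N(x_i, x_i)},
  \qquad
  \mathbb{E}\,\widehat{I}_N(f) \;=\; \int f(x)\,\mu(\mathrm{d}x),
\]
% unbiasedness follows from the first intensity being K_N(x, x) mu(dx).
% For the multivariate orthogonal polynomial ensembles of the paper,
\[
  \sqrt{N^{1 + 1/d}}\,\Bigl(\widehat{I}_N(f) - \int f \,\mathrm{d}\mu\Bigr)
  \;\xrightarrow[N \to \infty]{\ \mathrm{law}\ }\; \mathcal{N}\bigl(0, \Omega_{f}^{2}\bigr),
\]
% i.e., an error of order N^{-(1+1/d)/2}, beating the N^{-1/2} i.i.d. rate.
```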

Inference for determinantal point processes without spectral knowledge

no code implementations NeurIPS 2015 Rémi Bardenet, Michalis K. Titsias

DPPs possess desirable properties, such as exact sampling or analyticity of the moments, but learning the parameters of the kernel $K$ through likelihood-based inference is not straightforward.

Point Processes Variational Inference
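For intuition about why the likelihood is awkward, here is the finite L-ensemble form (the paper actually works with continuous kernels, where the denominator becomes a Fredholm determinant, but the issue is the same):

```latex
% Finite L-ensemble likelihood: S a realized subset of the ground set
\[
  \mathbb{P}(X = S) \;=\; \frac{\det L_S}{\det(L + I)},
\]
% every likelihood evaluation needs det(L + I), i.e., the full spectrum
% of the kernel -- the "spectral knowledge" that the paper's variational
% approach avoids.
```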

On Markov chain Monte Carlo methods for tall data

1 code implementation 11 May 2015 Rémi Bardenet, Arnaud Doucet, Chris Holmes

So far, we have only been able to propose subsampling-based methods that display good performance in scenarios where the Bernstein-von Mises approximation of the target posterior distribution is excellent.

Bayesian Inference
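To make the setting concrete, here is a naive subsampling Metropolis-Hastings step of the kind the paper reviews: the full-data log-likelihood ratio is replaced by a rescaled minibatch estimate. This is not the authors' corrected sampler, and the resulting chain is biased in general, which is part of what the paper analyses; model, data and parameter choices below are made up.

```python
# Naive subsampling Metropolis-Hastings: the log-likelihood ratio over
# all n points is replaced by a rescaled estimate on a random minibatch.
# NOT the paper's corrected sampler; the chain is biased in general.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
data = rng.normal(loc=1.0, scale=1.0, size=n)        # synthetic "tall" dataset

def loglik(theta, x):
    return -0.5 * (x - theta) ** 2                   # unit-variance Gaussian model

def naive_subsampled_mh(n_iters=2000, batch=1000, step=0.05):
    theta, chain = 0.0, []
    for _ in range(n_iters):
        prop = theta + step * rng.standard_normal()  # random-walk proposal
        idx = rng.choice(n, size=batch, replace=False)
        # rescaled minibatch estimate of the full log-likelihood ratio
        llr = (n / batch) * np.sum(loglik(prop, data[idx]) - loglik(theta, data[idx]))
        if np.log(rng.uniform()) < llr:              # flat prior assumed
            theta = prop
        chain.append(theta)
    return np.array(chain)

print(naive_subsampled_mh()[-500:].mean())           # near 1.0, up to subsampling bias
```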

Algorithms for Hyper-Parameter Optimization

no code implementations NeurIPS 2011 James S. Bergstra, Rémi Bardenet, Yoshua Bengio, Balázs Kégl

Random search has been shown to be sufficiently efficient for learning neural networks for several datasets, but we show it is unreliable for training DBNs.

Image Classification
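The tree-structured Parzen estimator (TPE) proposed in this paper is implemented in the hyperopt library; below is a minimal usage sketch. fmin, tpe.suggest and hp.uniform are hyperopt's documented API, while the toy objective is made up.

```python
# Minimal TPE run with hyperopt, which implements the tree-structured
# Parzen estimator from this paper.  Toy quadratic objective only.
from hyperopt import fmin, tpe, hp

def objective(params):
    x, y = params["x"], params["y"]
    return (x - 3) ** 2 + (y + 1) ** 2        # minimum at x = 3, y = -1

space = {
    "x": hp.uniform("x", -10, 10),
    "y": hp.uniform("y", -10, 10),
}

best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=200)
print(best)                                   # approximately {'x': 3, 'y': -1}
```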
