no code implementations • 21 Apr 2023 • Matthieu Cordonnier, Nicolas Keriven, Nicolas Tremblay, Samuel Vaiter
We study the convergence of message passing graph neural networks on random graph models to their continuous counterpart as the number of nodes tends to infinity.
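The paper studies the large-graph limit of such architectures; as a point of reference, here is a minimal sketch of one mean-aggregation message passing layer applied to an Erdős-Rényi random graph. The layer form, the name `mp_layer`, and the feature dimensions are illustrative assumptions, not the paper's model.

```python
import numpy as np

def mp_layer(A, X, W):
    """One mean-aggregation message passing layer: each node averages
    its neighbours' features, then applies a shared linear map and ReLU."""
    deg = A.sum(axis=1, keepdims=True)
    H = (A @ X) / np.maximum(deg, 1.0)  # aggregate neighbour features
    return np.maximum(H @ W, 0.0)       # pointwise update

# Erdos-Renyi random graph, in the spirit of the random graph models studied.
rng = np.random.default_rng(0)
n, p = 50, 0.2
A = np.triu(rng.random((n, n)) < p, 1).astype(float)
A = A + A.T                        # symmetric adjacency, no self-loops
X = rng.standard_normal((n, 3))    # node features
W = rng.standard_normal((3, 4))    # layer weights
H = mp_layer(A, X, W)              # new node embeddings, shape (n, 4)
```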
1 code implementation • 31 Oct 2022 • Simon Barthelmé, Nicolas Tremblay, Pierre-Olivier Amblard
Finally, an interesting by-product of the analysis is that a realisation from a DPP is typically contained in a subset of size O(m log m) formed using leverage score i.i.d. sampling.
no code implementations • 15 Jun 2022 • Yusuf Yigit Pilavci, Pierre-Olivier Amblard, Simon Barthelme, Nicolas Tremblay
The trace $\mathrm{tr}\left(q(\mathbf{L} + q\mathbf{I})^{-1}\right)$, where $\mathbf{L}$ is a symmetric diagonally dominant matrix, is the quantity of interest in some machine learning problems.
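For concreteness, here is a minimal sketch computing this trace exactly with dense linear algebra on a small graph Laplacian. The paper's interest is in cheap randomised estimators of this quantity, which are not reproduced here; the path-graph example and parameter value are illustrative assumptions.

```python
import numpy as np

# Path graph on 4 nodes: L = D - A is symmetric diagonally dominant.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

q = 0.5
# Exact value of tr(q (L + q I)^{-1}) by direct inversion, O(n^3);
# the point of the paper is to estimate it far more cheaply.
trace = q * np.trace(np.linalg.inv(L + q * np.eye(4)))
```

Each eigenvalue λ of L contributes q/(λ+q) to the trace, so the result always lies between 0 and n.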
no code implementations • 15 Oct 2021 • Yusuf Pilavci, Pierre-Olivier Amblard, Simon Barthelmé, Nicolas Tremblay
Large dimensional least-squares and regularised least-squares problems are expensive to solve.
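As a baseline for what "expensive" means here, a direct solve of a regularised least-squares problem via the normal equations costs O(nd² + d³); a minimal sketch, with illustrative sizes and a regularisation parameter chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 50
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
lam = 1.0

# Direct solve of the regularised normal equations
# (A^T A + lam I) x = A^T b.  The O(n d^2 + d^3) cost of this step
# is what becomes prohibitive in large dimensions.
x = np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)
```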
1 code implementation • 5 Mar 2021 • Lorenzo Dall'Amico, Romain Couillet, Nicolas Tremblay
This article unveils a new relation between the Nishimori temperature parametrizing a distribution P and the Bethe free energy on random Erdős-Rényi graphs with edge weights distributed according to P. Estimating the Nishimori temperature is a task of major importance in Bayesian inference problems; as a practical corollary of this new relation, a numerical method is proposed to accurately estimate it from the eigenvalues of the Bethe Hessian matrix of the weighted graph.
1 code implementation • 16 Oct 2020 • Hashem Ghanem, Nicolas Keriven, Nicolas Tremblay
Since this method can still be prohibitively costly with usual random features, we then incorporate optical random features that can be computed in constant time.
1 code implementation • NeurIPS 2020 • Lorenzo Dall'Amico, Romain Couillet, Nicolas Tremblay
This article considers the problem of community detection in sparse dynamical graphs in which the community structure evolves over time.
1 code implementation • 20 Mar 2020 • Lorenzo Dall'Amico, Romain Couillet, Nicolas Tremblay
This article considers spectral community detection in the regime of sparse networks with heterogeneous degree distributions, for which we devise an algorithm to efficiently retrieve communities.
no code implementations • 3 Dec 2019 • Lorenzo Dall'Amico, Romain Couillet, Nicolas Tremblay
Regularization of the classical Laplacian matrices was empirically shown to improve spectral clustering in sparse networks.
no code implementations • 29 Jan 2019 • Nicolas Tremblay, Andreas Loukas
Spectral clustering refers to a family of unsupervised learning algorithms that compute a spectral embedding of the original data based on the eigenvectors of a similarity graph.
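The pipeline described here, building a similarity graph, computing a spectral embedding from its Laplacian, and partitioning the embedded points, can be sketched minimally in NumPy. For two well-separated 1-D clusters, thresholding the sign of the Fiedler vector replaces the usual k-means step; the data and kernel choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two well-separated 1-D clusters.
x = np.concatenate([rng.normal(0.0, 0.1, 20), rng.normal(3.0, 0.1, 20)])

# Gaussian similarity graph and its unnormalised Laplacian L = D - W.
W = np.exp(-(x[:, None] - x[None, :]) ** 2)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

# Spectral embedding: the eigenvector of the second-smallest eigenvalue
# (the Fiedler vector); its sign pattern splits the two clusters.
_, vecs = np.linalg.eigh(L)
labels = (vecs[:, 1] > 0).astype(int)
```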
no code implementations • NeurIPS 2019 • Lorenzo Dall'Amico, Romain Couillet, Nicolas Tremblay
Spectral clustering is one of the most popular, yet still incompletely understood, methods for community detection on graphs.
2 code implementations • 23 Mar 2018 • Nicolas Tremblay, Simon Barthelmé, Pierre-Olivier Amblard
We apply our results to both the k-means and the linear regression problems, and give extensive empirical evidence that the small additional computational cost of DPP sampling comes with superior performance over its iid counterpart.
no code implementations • 5 Mar 2018 • Simon Barthelmé, Pierre-Olivier Amblard, Nicolas Tremblay
In this work we show that as the size of the ground set grows, $k$-DPPs and DPPs become equivalent, meaning that their inclusion probabilities converge.
2 code implementations • 23 Feb 2018 • Nicolas Tremblay, Simon Barthelme, Pierre-Olivier Amblard
The standard sampling algorithm is divided into three phases: (1) eigendecomposition of $\mathbf{L}$; (2) an eigenvector sampling phase, where $\mathbf{L}$'s eigenvectors are sampled independently via Bernoulli variables parametrized by their associated eigenvalues; (3) a Gram-Schmidt-type orthogonalisation procedure applied to the sampled eigenvectors.
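The three phases above correspond to the classical spectral algorithm for L-ensemble DPPs; a compact NumPy sketch follows. This is the textbook version, written for clarity rather than the efficiency the paper is concerned with, and the function name is an illustrative choice.

```python
import numpy as np

def sample_dpp(L, rng):
    """Sample a set of indices from the L-ensemble DPP with kernel L."""
    # Phase 1: eigendecomposition of L.
    lam, V = np.linalg.eigh(L)
    # Phase 2: keep eigenvector n independently with prob lam_n/(1+lam_n).
    V = V[:, rng.random(len(lam)) < lam / (1.0 + lam)]
    # Phase 3: Gram-Schmidt-type procedure, selecting one item per column.
    items = []
    while V.shape[1] > 0:
        p = np.sum(V ** 2, axis=1)       # selection probabilities
        i = rng.choice(len(p), p=p / p.sum())
        items.append(i)
        # Zero out row i in every column, drop one column, re-orthonormalise.
        j = np.argmax(np.abs(V[i, :]))
        V = V - np.outer(V[:, j] / V[i, j], V[i, :])
        V = np.delete(V, j, axis=1)
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)
    return sorted(items)
```

Each iteration of phase 3 removes one dimension, so the sample size equals the number of eigenvectors kept in phase 2.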
no code implementations • 3 Nov 2017 • Nicolas Tremblay, Paulo Gonçalves, Pierre Borgnat
The aim of this chapter is to review general concepts for the introduction of filters and representations of graph signals.
Signal Processing Information Theory Social and Information Networks
no code implementations • 7 Apr 2017 • Nicolas Tremblay, Simon Barthelme, Pierre-Olivier Amblard
We consider the problem of sampling k-bandlimited graph signals, i.e., linear combinations of the first k graph Fourier modes.
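A k-bandlimited signal is easy to construct explicitly: take the graph Laplacian's eigenvectors, ordered by eigenvalue, as the graph Fourier basis and combine the first k of them. A minimal sketch on a ring graph (the graph and coefficients are illustrative assumptions):

```python
import numpy as np

# Ring graph on 6 nodes; the graph Fourier modes are the Laplacian's
# eigenvectors, ordered by increasing eigenvalue (frequency).
n, k = 6, 2
A = np.roll(np.eye(n), 1, axis=0) + np.roll(np.eye(n), -1, axis=0)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

# A k-bandlimited signal: a linear combination of the first k modes.
alpha = np.array([1.0, -0.5])
x = U[:, :k] @ alpha

# Its graph Fourier transform is supported on the first k coefficients.
x_hat = U.T @ x
```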
no code implementations • 5 Mar 2017 • Nicolas Tremblay, Pierre-Olivier Amblard, Simon Barthelmé
For large graphs, i.e., in cases where the graph's spectrum is not accessible, we investigate, both theoretically and empirically, a sub-optimal but much faster DPP based on loop-erased random walks on the graph.
no code implementations • 27 Oct 2016 • Nicolas Keriven, Nicolas Tremblay, Yann Traonmilin, Rémi Gribonval
We demonstrate empirically that CKM performs similarly to Lloyd-Max, for a sketch size proportional to the number of centroids times the ambient dimension, and independent of the size of the original dataset.
no code implementations • 5 Feb 2016 • Nicolas Tremblay, Gilles Puy, Remi Gribonval, Pierre Vandergheynst
Spectral clustering has become a popular technique due to its high performance in many contexts.
no code implementations • 16 Nov 2015 • Gilles Puy, Nicolas Tremblay, Rémi Gribonval, Pierre Vandergheynst
In contrast, the second strategy is adaptive but yields optimal results.
no code implementations • 29 Sep 2015 • Nicolas Tremblay, Gilles Puy, Pierre Borgnat, Remi Gribonval, Pierre Vandergheynst
We build upon recent advances in graph signal processing to propose a faster spectral clustering algorithm.
Social and Information Networks Numerical Analysis