no code implementations • ICLR 2019 • Paulina Grnarova, Kfir Y. Levy, Aurelien Lucchi, Nathanael Perraudin, Thomas Hofmann, Andreas Krause
Generative Adversarial Networks (GANs) have shown great results in accurately modeling complex distributions, but their training is known to be difficult due to instabilities caused by a challenging minimax optimization problem.
1 code implementation • NeurIPS 2019 • Paulina Grnarova, Kfir Y. Levy, Aurelien Lucchi, Nathanael Perraudin, Ian Goodfellow, Thomas Hofmann, Andreas Krause
Evaluations are essential for: (i) relative assessment of different models and (ii) monitoring the progress of a single model throughout training.
1 code implementation • 7 Jul 2018 • Joachim Muth, Stefan Uhlich, Nathanael Perraudin, Thomas Kemp, Fabien Cardinaux, Yuki Mitsufuji
Music source separation with deep neural networks typically relies only on amplitude features.
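In this context, "amplitude features" typically means the magnitude spectrogram of the audio, i.e. the absolute value of the short-time Fourier transform with the phase discarded. A minimal sketch (the window size, hop length, and function name are illustrative assumptions, not from the paper):

```python
import numpy as np

def magnitude_spectrogram(signal, win=256, hop=128):
    """Magnitude of the STFT: the amplitude features commonly fed to
    source-separation networks. Phase is discarded by np.abs."""
    window = np.hanning(win)
    frames = [signal[i:i + win] * window
              for i in range(0, len(signal) - win + 1, hop)]
    stft = np.fft.rfft(np.array(frames), axis=1)
    return np.abs(stft)

# A 440 Hz tone sampled at 16 kHz, as a toy input.
x = np.sin(2 * np.pi * 440 * np.arange(4096) / 16000)
S = magnitude_spectrogram(x)  # one row per frame, one column per frequency bin
```

Working with phase as well (the direction this paper explores) would mean keeping the complex STFT instead of taking `np.abs`.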
no code implementations • 22 Jul 2016 • Nathanael Perraudin, Nicki Holighaus, Piotr Majdak, Peter Balazs
We present a novel method for the compensation of long duration data loss in audio signals, in particular music.
no code implementations • 12 Jul 2016 • Andreas Loukas, Nathanael Perraudin
An emerging way of tackling the dimensionality issues arising in the modeling of a multivariate process is to assume that the inherent data structure can be captured by a graph.
no code implementations • 22 Jun 2016 • Nathanael Perraudin, Andreas Loukas, Francesco Grassi, Pierre Vandergheynst
Graph-based methods for signal processing have shown promise for the analysis of data exhibiting irregular structure, such as those found in social, transportation, and sensor networks.
no code implementations • 21 Jun 2016 • Francesco Grassi, Nathanael Perraudin, Benjamin Ricaud
Graph Signal Processing generalizes classical signal processing to signals or data indexed by the vertices of a weighted graph.
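A vertex-indexed signal and its spectral representation can be sketched in a few lines: the eigenvectors of the graph Laplacian play the role of Fourier modes. This toy example (a 4-node path graph; the setup is illustrative, not taken from the paper) shows the graph Fourier transform and its inverse:

```python
import numpy as np

# Adjacency matrix of a 4-node path graph, its degree matrix,
# and the combinatorial Laplacian L = D - W.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(W.sum(axis=1))
L = D - W

# Laplacian eigenvectors are the graph analogue of Fourier modes.
eigvals, U = np.linalg.eigh(L)

x = np.array([1.0, 2.0, 3.0, 4.0])  # a signal indexed by the vertices
x_hat = U.T @ x                      # graph Fourier transform
x_rec = U @ x_hat                    # inverse transform recovers x
```

Smoothness of `x` with respect to the graph shows up as energy concentrated in the low-`eigvals` coefficients of `x_hat`.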
no code implementations • 18 May 2016 • Nauman Shahid, Nathanael Perraudin, Pierre Vandergheynst
Many real-world datasets exhibit a linear or non-linear low-rank structure in a very low-dimensional space.
no code implementations • 10 Mar 2016 • Nathanael Perraudin, Benjamin Ricaud, David Shuman, Pierre Vandergheynst
Accordingly, we suggest a new way to incorporate a notion of locality, and develop local uncertainty principles that bound the concentration of the analysis coefficients of each atom of a localized graph spectral filter frame in terms of quantities that depend on the local structure of the graph around the center vertex of the given atom.
no code implementations • 5 Feb 2016 • Nauman Shahid, Nathanael Perraudin, Gilles Puy, Pierre Vandergheynst
We introduce a novel framework for an approximate recovery of data matrices which are low-rank on graphs, from sampled measurements.
no code implementations • 15 Sep 2015 • Ana Susnjara, Nathanael Perraudin, Daniel Kressner, Pierre Vandergheynst
Signal processing on graphs has developed into a very active field of research during the last decade.
no code implementations • 29 Jul 2015 • Nauman Shahid, Nathanael Perraudin, Vassilis Kalofolias, Gilles Puy, Pierre Vandergheynst
Clustering experiments on 7 benchmark datasets with different types of corruptions and background separation experiments on 3 video datasets show that our proposed model outperforms 10 state-of-the-art dimensionality reduction models.
no code implementations • 4 Feb 2014 • Nathanael Perraudin, Vassilis Kalofolias, David Shuman, Pierre Vandergheynst
Convex optimization is an essential tool for machine learning, as many of its problems can be formulated as minimization problems of specific objective functions.
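As a concrete instance of such a formulation, least-squares regression minimizes the convex objective f(w) = ||Xw − y||² by following its gradient. A minimal sketch (the data, step size, and iteration count are illustrative assumptions, not from the paper):

```python
import numpy as np

# Synthetic regression problem with a known ground-truth weight vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

# Gradient descent on the convex objective f(w) = ||Xw - y||^2.
w = np.zeros(3)
lr = 0.005
for _ in range(2000):
    grad = 2 * X.T @ (X @ w - y)  # gradient of the quadratic objective
    w -= lr * grad
```

Convexity is what guarantees that this simple iteration converges to the global minimizer; the proximal splitting methods surveyed in work like this paper extend the same idea to non-smooth objectives.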