Numerical experiments demonstrate that deep neural network classifiers progressively separate class distributions around their means, achieving linear separability.

In contrast, a soft-thresholding on tight frames can reduce within-class variability while preserving class means.
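A minimal numerical sketch of the thresholding idea: soft-thresholding shrinks every coefficient toward zero by a fixed amount, so small coefficients (the per-sample variability) are suppressed while large coefficients (shared by the class mean) survive. The `soft_threshold` function and the toy 4-dimensional "class" below are illustrative, not the paper's actual frame construction.

```python
import numpy as np

def soft_threshold(c, t):
    """Soft-thresholding: shrink each coefficient toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

# Two noisy samples of the same class: small noise coefficients are set
# to zero, the large mean-carrying coefficients are only slightly shrunk.
rng = np.random.default_rng(0)
mean = np.array([5.0, 0.0, -4.0, 0.0])
a = soft_threshold(mean + 0.3 * rng.standard_normal(4), 1.0)
b = soft_threshold(mean + 0.3 * rng.standard_normal(4), 1.0)
```
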

The target measure is generated via a deterministic gradient descent algorithm, so as to match a set of statistics computed from the given observed realization.

The covariance of a stationary process $X$ is diagonalized by a Fourier transform.
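This diagonalization can be checked numerically: a stationary covariance depends only on the lag, so the covariance matrix is circulant, and conjugating it by the unitary DFT matrix yields a diagonal matrix whose entries are the Fourier transform of the first column. The specific lag values below are an arbitrary illustrative example.

```python
import numpy as np

n = 8
# Stationary covariance depends only on the lag, so C is circulant.
c = np.array([1.0, 0.5, 0.2, 0.1, 0.05, 0.1, 0.2, 0.5])  # symmetric lags
C = np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])

F = np.fft.fft(np.eye(n)) / np.sqrt(n)   # unitary DFT matrix
D = F @ C @ F.conj().T                   # diagonal: the power spectrum
off = D - np.diag(np.diag(D))            # off-diagonal residual, ~0
```
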

It is implemented in a deep convolutional network with a homotopy algorithm that converges exponentially.

The wavelet scattering transform is an invariant signal representation suitable for many signal processing and machine learning applications.

To approximate (interpolate) the marking function, our baseline approach builds a statistical regression model of the marks with respect to a local point-distance representation.

For wavelet filters, we show numerically that signals having sparse wavelet coefficients can be recovered from few phase harmonic correlations, which provide a compressive representation.

Generative Adversarial Nets (GANs) and Variational Auto-Encoders (VAEs) provide impressive image generation from Gaussian white noise, but the underlying mathematics are not well understood.

We present a machine learning algorithm for the prediction of molecule properties inspired by ideas from density functional theory.

Multiscale hierarchical convolutional networks are structured deep convolutional networks where layers are indexed by progressively higher dimensional attributes, which are learned from training data.

We propose a new approach to linear ill-posed inverse problems.


Sparse scattering regressions give state-of-the-art results over two databases of organic planar molecules.

Deep convolutional networks provide state-of-the-art classification and regression results over many high-dimensional problems.

We present a new representation of harmonic sounds that linearizes the dynamics of pitch and spectral envelope, while remaining stable to deformations in the time-frequency plane.

We present a novel approach to the regression of quantum mechanical energies based on a scattering transform of an intermediate electron density representation.

Dictionary learning algorithms and supervised deep convolutional networks have considerably improved the efficiency of predefined feature representations such as SIFT.

The classification of high-dimensional data defined on graphs is particularly difficult when the graph geometry is unknown.

A rigid-motion scattering computes adaptive invariants along translations and rotations, with a deep convolutional network.

We introduce a two-layer wavelet scattering network for object classification.

This paper presents a new approach to the electro-sensing problem using wavelets.

We introduce general scattering transforms as mathematical models of deep neural networks with $\ell^2$ pooling.

A scattering transform defines a locally translation invariant representation which is stable to time-warping deformations.


A wavelet scattering network computes a translation invariant image representation, which is stable to deformations and preserves high frequency information for classification.
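A stripped-down numerical sketch of the scattering idea: cascade band-pass filtering with a complex modulus, then average. Global averaging makes the output exactly invariant to circular translations, while the iterated moduli retain high-frequency envelope information. The `gabor_bank` filters below are crude Gaussian bumps standing in for a proper Morlet wavelet family, and the function names are illustrative.

```python
import numpy as np

def gabor_bank(n, scales=(4, 8, 16)):
    """Crude analytic band-pass filters in the Fourier domain
    (a stand-in for a proper Morlet wavelet family)."""
    freqs = np.fft.fftfreq(n)
    return [np.exp(-0.5 * ((freqs - 1.0 / s) * s * 4) ** 2) for s in scales]

def scattering(x, bank):
    """Order-0/1/2 scattering: iterated |wavelet filtering| + averaging."""
    X = np.fft.fft(x)
    s0 = [np.mean(x)]
    s1, s2 = [], []
    for psi1 in bank:
        u1 = np.abs(np.fft.ifft(X * psi1))       # first-order envelope
        s1.append(np.mean(u1))
        U1 = np.fft.fft(u1)
        for psi2 in bank:
            u2 = np.abs(np.fft.ifft(U1 * psi2))  # second-order envelope
            s2.append(np.mean(u2))
    return np.array(s0 + s1 + s2)

x = np.sin(2 * np.pi * 0.125 * np.arange(256))
coeffs = scattering(x, gabor_bank(256))
```

Shifting `x` circularly leaves `coeffs` unchanged, since the final averaging discards position while the moduli preserve the spectral envelope information.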

A scattering vector is a local descriptor including multiscale and multi-direction co-occurrence information.

A general framework for solving image inverse problems is introduced in this paper.
