Search Results for author: Sixin Zhang

Found 13 papers, 9 papers with code

On the Nash equilibrium of moment-matching GANs for stationary Gaussian processes

no code implementations • 14 Mar 2022 • Sixin Zhang

Generative Adversarial Networks (GANs) learn an implicit generative model from data samples through a two-player game.

Gaussian Processes
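The two-player moment-matching idea can be illustrated in a heavily simplified, hypothetical form (this is not the paper's construction): fix the discriminator's features to be the first two moments and fit a one-dimensional Gaussian generator by gradient descent on the moment mismatch.

```python
import numpy as np

# Toy sketch, not the paper's construction: fit a Gaussian generator
# N(mu, sigma^2) by gradient descent on the squared mismatch between its
# moments (mean, second moment) and target moments -- the degenerate case
# of moment matching where the discriminator's features are fixed.
target = np.array([2.0, 2.0**2 + 3.0**2])    # moments of N(2, 3^2)

mu, log_sigma = 0.0, 0.0
lr = 1e-3
for _ in range(10_000):
    sigma2 = np.exp(2 * log_sigma)
    model = np.array([mu, mu**2 + sigma2])   # moments of N(mu, sigma^2)
    r = model - target                       # moment residuals
    grad_mu = 2 * r[0] + 4 * mu * r[1]
    grad_ls = 4 * sigma2 * r[1]              # uses d(sigma^2)/d(log_sigma) = 2*sigma^2
    mu -= lr * grad_mu
    log_sigma -= lr * grad_ls
```

In this degenerate setting the game has a unique equilibrium, `(mu, sigma) = (2, 3)`; the adversarial case studied in the paper replaces the fixed moments with a learned discriminator.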

Generalized Rectifier Wavelet Covariance Models For Texture Synthesis

1 code implementation • ICLR 2022 • Antoine Brochard, Sixin Zhang, Stéphane Mallat

State-of-the-art maximum entropy models for texture synthesis are built from statistics relying on image representations defined by convolutional neural networks (CNNs).

Texture Synthesis

Leveraging Joint-Diagonalization in Transform-Learning NMF

1 code implementation • 10 Dec 2021 • Sixin Zhang, Emmanuel Soubies, Cédric Févotte

Non-negative matrix factorization with transform learning (TL-NMF) is a recent idea that aims at learning data representations suited to NMF.
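For context, plain NMF (without the transform-learning part this paper adds) factorizes a non-negative matrix as V ≈ WH. A minimal sketch using the classical Lee–Seung multiplicative updates for the Frobenius objective — not the TL-NMF algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(1)
V = np.abs(rng.normal(size=(20, 30)))        # non-negative data matrix
K = 4                                        # latent rank
W = np.abs(rng.normal(size=(20, K)))
H = np.abs(rng.normal(size=(K, 30)))

eps = 1e-12                                  # guards against division by zero
err0 = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
for _ in range(200):
    # Lee-Seung multiplicative updates for ||V - W H||_F^2;
    # they preserve non-negativity and do not increase the objective
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

TL-NMF additionally learns a transform applied to the data before factorization; the sketch above only covers the NMF half of the objective.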

Particle gradient descent model for point process generation

2 code implementations • 27 Oct 2020 • Antoine Brochard, Bartłomiej Błaszczyszyn, Stéphane Mallat, Sixin Zhang

This paper presents a statistical model for stationary ergodic point processes, estimated from a single realization observed in a square window.

Point Processes • Topological Data Analysis
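The particle-gradient-descent idea can be sketched on a toy example (the `summary` statistics below are crude stand-ins I chose for illustration, not the paper's wavelet-based descriptors): move particle positions by gradient descent until their summary statistics match those of the observed pattern.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
ref = 0.3 + 0.05 * rng.normal(size=(n, 2))   # "observed" clustered point pattern

def summary(pts):
    # toy summary statistics: mean coordinate and mean pairwise distance
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return np.array([pts.mean(), d.mean()])

def loss_and_grad(pts, target):
    m = len(pts)
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    r = np.array([pts.mean(), dist.mean()]) - target
    g_mean = np.full_like(pts, 1.0 / (2 * m))          # grad of mean coordinate
    unit = np.divide(diff, dist[..., None],
                     out=np.zeros_like(diff), where=dist[..., None] > 0)
    g_dist = 2.0 * unit.sum(axis=1) / m**2             # grad of mean pairwise distance
    return r @ r, 2 * r[0] * g_mean + 2 * r[1] * g_dist

target = summary(ref)
pts = rng.uniform(size=(n, 2))               # particles, initialized at random
loss0, _ = loss_and_grad(pts, target)
for _ in range(2000):
    loss, g = loss_and_grad(pts, target)
    pts -= 1.0 * g                           # gradient step on particle positions
```

After descent, the particles' statistics match the observed pattern's; the paper uses much richer statistics estimated from a single realization in a square window.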

Data Assimilation Networks

1 code implementation • 19 Oct 2020 • Pierre Boudier, Anthony Fillion, Serge Gratton, Selime Gürol, Sixin Zhang

Data assimilation (DA) aims at forecasting the state of a dynamical system by combining a mathematical representation of the system with noisy observations, taking their uncertainties into account.

BIG-bench Machine Learning
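The uncertainty-weighted combination at the heart of data assimilation is easiest to see in the simplest possible case — a scalar Kalman analysis step (a classical baseline, not the network proposed in the paper):

```python
# Scalar Kalman analysis step: combine a model forecast with a noisy
# observation, each weighted by its uncertainty (variance).
forecast, var_f = 2.0, 1.0       # model forecast and its variance
obs, var_o = 3.0, 0.5            # observation and its variance

gain = var_f / (var_f + var_o)               # Kalman gain: trust the less uncertain source more
analysis = forecast + gain * (obs - forecast)
var_a = (1 - gain) * var_f                   # analysis is less uncertain than either input

print(analysis, var_a)                       # analysis ~ 2.667, variance ~ 0.333
```

Note that the analysis variance (1/3) is smaller than both input variances — combining the two sources always reduces uncertainty in this linear-Gaussian setting.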

Maximum Entropy Models from Phase Harmonic Covariances

1 code implementation • 22 Nov 2019 • Sixin Zhang, Stéphane Mallat

The covariance of a stationary process $X$ is diagonalized by a Fourier transform.
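The abstract's first sentence can be checked numerically in the discrete, periodic case: the covariance of a circularly stationary process is a circulant matrix, and every circulant matrix is diagonalized by the DFT.

```python
import numpy as np

n = 8
c = np.array([4.0, 2.0, 1.0, 0.5, 0.25, 0.5, 1.0, 2.0])  # symmetric covariance sequence
C = np.array([[c[(j - i) % n] for j in range(n)] for i in range(n)])  # circulant covariance

F = np.fft.fft(np.eye(n)) / np.sqrt(n)       # unitary DFT matrix
D = F @ C @ F.conj().T                       # covariance in the Fourier basis
off = D - np.diag(np.diag(D))
print(np.allclose(off, 0))                   # True: D is diagonal
print(np.allclose(np.diag(D).real, np.fft.fft(c).real))  # True: diagonal = power spectrum
```

The diagonal entries are the DFT of the covariance sequence, i.e. the power spectrum — which is why second-order statistics alone cannot capture phase structure, motivating the phase harmonic covariances of the paper.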

Statistical learning of geometric characteristics of wireless networks

no code implementations • 19 Dec 2018 • Antoine Brochard, Bartłomiej Błaszczyszyn, Stéphane Mallat, Sixin Zhang

To approximate (interpolate) the marking function, our baseline approach builds a statistical regression model of the marks with respect to some local point-distance representation.

Point Processes • Regression

Phase Harmonic Correlations and Convolutional Neural Networks

1 code implementation • 29 Oct 2018 • Stéphane Mallat, Sixin Zhang, Gaspar Rochette

For wavelet filters, we show numerically that signals having sparse wavelet coefficients can be recovered from a few phase harmonic correlations, which provide a compressive representation.

Time Series • Time Series Analysis
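The basic operator in this line of work is the phase harmonic: keep a complex coefficient's modulus and multiply its phase by an integer k. A minimal sketch of that operator (the correlation statistics built on top of it are not reproduced here):

```python
import numpy as np

def phase_harmonic(z, k):
    # phase harmonic of order k: [z]^k = |z| * exp(i * k * arg(z));
    # k = 1 leaves z unchanged, and the modulus is preserved for every k
    return np.abs(z) * np.exp(1j * k * np.angle(z))

z = np.array([1 + 1j, 2j, -3.0 + 0j])        # sample complex wavelet coefficients
print(phase_harmonic(z, 2))                  # doubles each phase, keeps each modulus
```

Unlike raising z to the power k, the phase harmonic is nonexpansive in modulus, which is what makes correlations between different harmonic orders well behaved.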

Distributed stochastic optimization for deep learning (thesis)

no code implementations • 7 May 2016 • Sixin Zhang

We also find a surprising connection between momentum SGD and the EASGD method with a negative moving average rate.

Image Classification • Stochastic Optimization +1

Deep learning with Elastic Averaging SGD

10 code implementations • NeurIPS 2015 • Sixin Zhang, Anna Choromanska, Yann LeCun

We empirically demonstrate that in the deep learning setting, due to the existence of many local optima, allowing more exploration can lead to improved performance.

Image Classification • Stochastic Optimization
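The elastic averaging update couples each worker's local SGD step to a shared center variable. A toy sketch on a noisy quadratic (the objective and hyperparameters here are illustrative, not from the paper's experiments):

```python
import numpy as np

rng = np.random.default_rng(3)
dim, workers = 10, 4
x_star = rng.normal(size=dim)                # minimizer of the toy objective

def grad(x):
    # noisy gradient of f(x) = 0.5 * ||x - x_star||^2
    return (x - x_star) + 0.1 * rng.normal(size=dim)

eta, rho = 0.1, 1.0
alpha = eta * rho                            # elastic coupling strength
xs = [rng.normal(size=dim) for _ in range(workers)]
center = np.zeros(dim)                       # the shared "center" variable

for _ in range(500):
    old = [x.copy() for x in xs]
    for i in range(workers):
        # local SGD step plus an elastic pull toward the center
        xs[i] = old[i] - eta * grad(old[i]) - alpha * (old[i] - center)
    # the center moves toward the average of the workers
    center = center + alpha * sum(x - center for x in old)
```

The elastic term lets workers wander (exploration) while the center aggregates them; setting the coupling too high collapses all workers onto the center and removes the exploration the abstract credits for the improved performance.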

No More Pesky Learning Rates

no code implementations • 6 Jun 2012 • Tom Schaul, Sixin Zhang, Yann LeCun

The performance of stochastic gradient descent (SGD) depends critically on how learning rates are tuned and decreased over time.
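The sensitivity to learning-rate tuning and decay is easy to reproduce on a noisy scalar quadratic (a generic illustration of the problem the paper addresses, not the adaptive method it proposes): a constant rate plateaus at a noise floor, while a 1/t decay keeps improving.

```python
import numpy as np

rng = np.random.default_rng(5)

def run_sgd(schedule, steps=2000):
    # minimize f(x) = 0.5 * (x - 1)^2 from noisy gradients
    x = 5.0
    for t in range(1, steps + 1):
        g = (x - 1.0) + rng.normal()         # gradient plus unit-variance noise
        x -= schedule(t) * g
    return x

# average the final error over repeats to smooth out the noise
const_err = np.mean([abs(run_sgd(lambda t: 0.5) - 1.0) for _ in range(20)])
decay_err = np.mean([abs(run_sgd(lambda t: 1.0 / t) - 1.0) for _ in range(20)])
```

With the constant rate the iterate bounces around the minimum at a scale set by the rate itself; the 1/t (Robbins–Monro) schedule averages the noise away, which is the trade-off the paper's adaptive per-parameter rates aim to resolve without manual tuning.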
