Generative Models

Sliced Iterative Generator

Introduced by Dai et al. in Sliced Iterative Normalizing Flows

The Sliced Iterative Generator (SIG) is an iterative generative model that is a Normalizing Flow (NF) but shares the advantages of Generative Adversarial Networks (GANs). The model is based on iterative Optimal Transport over a series of 1D slices through the data space, matching on each slice the probability density function (PDF) of the generated samples to that of the data. To improve efficiency, the directions of the orthogonal slices at each iteration are chosen to maximize the PDF difference between the generated samples and the data, as measured by the Wasserstein distance. A patch-based approach is adopted to model images hierarchically, enabling the model to scale well to high dimensions.
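The core update above — 1D optimal transport of sample marginals onto data marginals along orthogonal slices — can be sketched as follows. This is a minimal illustration in NumPy, assuming equal sample and data set sizes and using random orthonormal directions; the actual SIG algorithm additionally optimizes the slice directions to maximize the sliced Wasserstein distance, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

def sliced_ot_iteration(samples, data):
    """One illustrative SIG-style iteration: project both point sets onto a
    random orthonormal basis and, along each direction, match the sample
    marginal to the data marginal by empirical quantile matching (exact 1D
    optimal transport). Assumes samples and data have the same length."""
    dim = data.shape[1]
    # Random orthonormal directions via QR of a Gaussian matrix.
    directions, _ = np.linalg.qr(rng.normal(size=(dim, dim)))
    for k in range(dim):
        w = directions[:, k]
        s_proj = samples @ w
        d_proj = data @ w
        # 1D OT map: send the i-th smallest sample projection to the
        # i-th smallest data projection.
        order = np.argsort(s_proj)
        shift = np.empty_like(s_proj)
        shift[order] = np.sort(d_proj) - s_proj[order]
        # Move each sample along w by its 1D transport displacement.
        samples = samples + np.outer(shift, w)
    return samples

# Usage: push Gaussian noise toward a shifted target distribution.
samples = rng.normal(size=(500, 2))
data = rng.normal(loc=3.0, size=(500, 2))
samples = sliced_ot_iteration(samples, data)
```

Because the directions form a complete orthonormal basis and each 1D update leaves the orthogonal projections unchanged, a single pass already matches all basis marginals of the sample set to those of the data.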

Unlike GANs, SIG has an NF structure and allows efficient likelihood evaluations that can be used in downstream tasks. While SIG has a deep neural network architecture, the approach deviates significantly from the current deep learning paradigm, as it does not use concepts such as mini-batching, stochastic gradient descent, gradient back-propagation through deep layers, or non-convex loss function optimization. SIG is largely insensitive to hyperparameter tuning, making it a useful generative tool for ML experts and non-experts alike.

Source: Sliced Iterative Normalizing Flows

Task Papers Share
Image Generation 4 19.05%
Test 2 9.52%
Translation 1 4.76%
Speaker Identification 1 4.76%
Disentanglement 1 4.76%
Domain Adaptation 1 4.76%
Action Recognition 1 4.76%
Dynamic Time Warping 1 4.76%
One-Shot Learning 1 4.76%