NICE, or Non-linear Independent Components Estimation, is a framework for modeling complex high-dimensional densities. It is based on the idea that a good representation is one in which the data has a distribution that is easy to model. To this end, a non-linear deterministic transformation of the data is learned that maps it to a latent space, so that the transformed data conforms to a factorized distribution, i.e., resulting in independent latent variables. The transformation is parameterized so that computing the determinant of the Jacobian and the inverse Jacobian is trivial, yet it retains the ability to learn complex non-linear transformations via a composition of simple building blocks, each based on a deep neural network. The training criterion is simply the exact log-likelihood. The transformation used in NICE is the affine coupling layer without the scale term, known as the additive coupling layer:
$$ y_{I_{2}} = x_{I_{2}} + m\left(x_{I_{1}}\right) $$
$$ x_{I_{2}} = y_{I_{2}} - m\left(y_{I_{1}}\right) $$
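The two formulas above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the partition sizes, the toy coupling function `m` (in NICE, `m` is a deep neural network), and all names are assumptions made for the example. It also checks the key property of additive coupling: the layer is exactly invertible, and its Jacobian is triangular with unit diagonal, so the log-determinant term in the change-of-variables formula vanishes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the coupling function m; in NICE this is a deep net.
W = rng.standard_normal((2, 2))

def m(x):
    return np.tanh(x @ W)

def additive_coupling_forward(x):
    # Split the input into the two partitions I1 and I2 (sizes are illustrative).
    x1, x2 = x[:, :2], x[:, 2:]
    y1 = x1                 # identity on the first partition
    y2 = x2 + m(x1)         # shift the second partition by m(x1)
    return np.concatenate([y1, y2], axis=1)

def additive_coupling_inverse(y):
    y1, y2 = y[:, :2], y[:, 2:]
    x1 = y1
    x2 = y2 - m(y1)         # subtracting the same shift inverts the layer
    return np.concatenate([x1, x2], axis=1)

x = rng.standard_normal((4, 4))
y = additive_coupling_forward(x)
assert np.allclose(additive_coupling_inverse(y), x)  # exact invertibility
# The Jacobian dy/dx is lower-triangular with ones on the diagonal,
# so |det J| = 1 and the log-likelihood needs no Jacobian correction.
```

Because each layer leaves one partition unchanged, NICE alternates which partition is transformed across layers so that every dimension is eventually updated.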
Source: NICE: Non-linear Independent Components Estimation

| Task | Papers | Share |
|------|--------|-------|
| Time Series | 1 | 25.00% |
| Causal Identification | 1 | 25.00% |
| Causal Inference | 1 | 25.00% |
| Image Generation | 1 | 25.00% |
| Component | Type |
|-----------|------|
| Affine Coupling | Bijective Transformation |
| Normalizing Flows | Distribution Approximation |