Jacobian Determinant of Normalizing Flows

12 Feb 2021  ·  Huadong Liao, JiaWei He

Normalizing flows learn a diffeomorphic mapping between the target and base distributions, and the Jacobian determinant of that mapping forms another real-valued function. In this paper, we show that the Jacobian determinant mapping is unique for the given distributions; hence the likelihood objective of flows has a unique global optimum. In particular, the likelihood for a class of flows can be expressed explicitly in terms of the eigenvalues of the auto-correlation matrix of each individual data point, independently of the neural network parameterization; this yields a theoretical optimal value of the likelihood objective and relates the model to probabilistic PCA. Additionally, the Jacobian determinant is a measure of local volume change and is maximized when maximum likelihood estimation (MLE) is used for optimization. To stabilize the training of normalizing flows, a balance between volume expansion and contraction must be maintained, which amounts to Lipschitz constraints on the diffeomorphic mapping and its inverse. Based on these theoretical results, we propose several principles for designing normalizing flows, and we conduct numerical experiments on high-dimensional datasets (such as CelebA-HQ 1024x1024) that demonstrate improved training stability.
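For context, the likelihood objective referred to above is the standard change-of-variables formula (the notation here, with $f$ for the diffeomorphism mapping data $x$ to the base variable $z = f(x)$, is ours, not the paper's):

$$\log p_X(x) = \log p_Z\big(f(x)\big) + \log \left| \det \frac{\partial f(x)}{\partial x} \right|$$

The second term is the log Jacobian determinant, the measure of local volume change whose balance the abstract argues must be controlled via Lipschitz constraints on $f$ and $f^{-1}$.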
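The following is a minimal sketch of how this objective is typically computed in practice, using a single affine coupling layer in PyTorch. It is an illustration of the general technique, not the paper's implementation; the bound `max_log_scale` is one hypothetical way to enforce the kind of Lipschitz constraint on the map and its inverse that the abstract calls for.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """One affine coupling layer: splits x into two halves, transforms one
    half conditioned on the other, and returns log |det J| of the map."""

    def __init__(self, dim, hidden=64, max_log_scale=2.0):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )
        # Bounding the log-scales keeps both the layer and its inverse
        # Lipschitz, balancing volume expansion against contraction.
        self.max_log_scale = max_log_scale

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = self.max_log_scale * torch.tanh(s)  # log-scale clamped to [-c, c]
        z2 = x2 * torch.exp(s) + t
        log_det = s.sum(dim=1)                  # Jacobian is triangular
        return torch.cat([x1, z2], dim=1), log_det

def nll(flow, x, base):
    """Negative log-likelihood via the change-of-variables formula."""
    z, log_det = flow(x)
    return -(base.log_prob(z).sum(dim=1) + log_det).mean()

# Usage: minimizing nll is MLE, which maximizes the log-det term on data.
dim = 4
flow = AffineCoupling(dim)
base = torch.distributions.Normal(0.0, 1.0)
x = torch.randn(8, dim)
loss = nll(flow, x, base)
loss.backward()
```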
