Normalising Flows
22 papers with code • 0 benchmarks • 0 datasets
Latest papers
Flexible Tails for Normalising Flows, with Application to the Modelling of Financial Return Data
We propose a transformation capable of altering the tail properties of a distribution, motivated by extreme value theory, which can be used as a layer in a normalizing flow to approximate multivariate heavy tailed distributions.
Probabilistic Classification by Density Estimation Using Gaussian Mixture Model and Masked Autoregressive Flow
Mixture models are a family of density estimators, such as the Gaussian Mixture Model (GMM) fitted by expectation maximization.
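The snippet above mentions fitting a GMM by expectation maximization. As a generic illustration (not the paper's method), here is a minimal 1D two-component EM loop in NumPy; the data, initialisation, and component count are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1D data drawn from two Gaussian clusters
x = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(3.0, 1.0, 500)])

# Initial guesses for K=2 components
pi = np.array([0.5, 0.5])      # mixing weights
mu = np.array([-1.0, 1.0])     # component means
sigma = np.array([1.0, 1.0])   # component standard deviations

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

for _ in range(100):
    # E-step: responsibility of each component for each point, shape (N, K)
    r = pi * gauss(x[:, None], mu, sigma)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and variances from responsibilities
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(np.sort(mu))  # means recovered near the true cluster centres
```

The fitted mixture gives a closed-form density estimate `sum(pi_k * N(x; mu_k, sigma_k))`, which is the kind of density the paper contrasts with flow-based estimators.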
NEnv: Neural Environment Maps for Global Illumination
We propose NEnv, a fully-differentiable deep-learning method capable of compressing, and learning to sample from, a single environment map.
Decorrelation using Optimal Transport
Decorrelating a feature space from protected attributes is an area of active research in ethics, fairness, and the natural sciences.
HuManiFlow: Ancestor-Conditioned Normalising Flows on SO(3) Manifolds for Human Pose and Shape Distribution Estimation
Monocular 3D human pose and shape estimation is an ill-posed problem since multiple 3D solutions can explain a 2D image of a subject.
OverFlow: Putting flows on top of neural transducers for better TTS
Neural HMMs are a type of neural transducer recently proposed for sequence-to-sequence modelling in text-to-speech.
Variational Gibbs Inference for Statistical Model Estimation from Incomplete Data
We introduce variational Gibbs inference (VGI), a new general-purpose method for estimating the parameters of statistical models from incomplete data.
Bootstrap Your Flow
Normalizing flows are flexible, parameterized distributions that can be used to approximate expectations from intractable distributions via importance sampling.
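The snippet above describes approximating expectations under an intractable distribution by importance sampling from a tractable, parameterized proposal. A minimal sketch of that idea (not the paper's method): the "flow" here is just a fixed affine transform of a standard normal, whose exact log-density follows from the change-of-variables formula, and the target, its parameters, and the sample size are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Target: unnormalised log-density of N(2, 0.5^2); pretend the normaliser is unknown
def log_p_unnorm(x):
    return -0.5 * ((x - 2.0) / 0.5) ** 2

# Proposal "flow": z -> scale * z + shift applied to a standard normal base,
# with an exact log-density from the change-of-variables formula
scale, shift = 1.5, 1.0

def sample_q(n):
    return scale * rng.standard_normal(n) + shift

def log_q(x):
    z = (x - shift) / scale
    return -0.5 * z ** 2 - 0.5 * np.log(2 * np.pi) - np.log(scale)

# Self-normalised importance sampling estimate of E_p[x]
x = sample_q(100_000)
log_w = log_p_unnorm(x) - log_q(x)
w = np.exp(log_w - log_w.max())  # subtract the max for numerical stability
w /= w.sum()
est = (w * x).sum()
print(est)  # close to 2.0, the mean of the target
```

In a real normalizing flow the affine transform is replaced by a learned invertible network, but the importance-weighting step is the same.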
Sinusoidal Flow: A Fast Invertible Autoregressive Flow
Normalising flows offer a flexible way of modelling continuous probability distributions.
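The snippet above states the core idea of normalising flows: pushing a simple base density through an invertible transform yields a new continuous density via the change-of-variables formula. A minimal hand-written illustration (the transform `exp` and the grid check are choices made for this example, not anything from the paper):

```python
import numpy as np

# Base density: standard normal, in log space
def log_base(z):
    return -0.5 * z ** 2 - 0.5 * np.log(2 * np.pi)

# Invertible transform f(z) = exp(z), inverse f^{-1}(x) = log(x).
# Change of variables:
#   log p_x(x) = log p_z(f^{-1}(x)) + log |d f^{-1}/dx| = log p_z(log x) - log x
def log_flow_density(x):
    return log_base(np.log(x)) - np.log(x)

# Sanity check: the transformed density should integrate to 1
xs = np.linspace(1e-4, 50.0, 200_000)
dx = xs[1] - xs[0]
mass = np.exp(log_flow_density(xs)).sum() * dx
print(round(mass, 3))  # close to 1.0
```

Practical flows stack many learned invertible layers and train them by maximising exactly this log-density on data; the single fixed `exp` layer here just makes the bookkeeping visible.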
Learning the Prediction Distribution for Semi-Supervised Learning with Normalising Flows
In this work, we propose a probabilistically principled general approach to SSL that considers the distribution over label predictions, for labels of different complexity, from "one-hot" vectors to binary vectors and images.