Normalising Flows

16 papers with code • 0 benchmarks • 0 datasets

Normalising flows model complex probability distributions by applying a sequence of invertible, differentiable transformations to a simple base distribution, which allows exact likelihood evaluation via the change-of-variables formula as well as efficient sampling.

Most implemented papers

Block Neural Autoregressive Flow

nicola-decao/BNAF 9 Apr 2019

Recently, as an alternative to hand-crafted bijections, Huang et al. (2018) proposed neural autoregressive flow (NAF), which is a universal approximator for density functions.
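
As background for autoregressive flows like this one, here is a minimal NumPy sketch (a toy linear conditioner, not the BNAF or NAF architecture): because each output dimension depends only on earlier input dimensions, the Jacobian is triangular and its log-determinant reduces to a sum of per-dimension log scales. All names below are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    def affine_autoregressive(x, weights):
        """Toy conditioner: mu_i and log_sigma_i are linear in x_<i."""
        d = x.shape[-1]
        z = np.empty_like(x)
        log_det = 0.0
        for i in range(d):
            context = x[:i]
            mu = context @ weights["mu"][i][:i]
            log_sigma = context @ weights["log_sigma"][i][:i]
            z[i] = (x[i] - mu) * np.exp(-log_sigma)
            log_det += -log_sigma          # triangular Jacobian: sum of diagonal log terms
        return z, log_det

    d = 3
    weights = {"mu": [rng.normal(size=d) for _ in range(d)],
               "log_sigma": [0.1 * rng.normal(size=d) for _ in range(d)]}
    x = rng.normal(size=d)
    z, log_det = affine_autoregressive(x, weights)

    # Change of variables: log p_X(x) = log N(z; 0, I) + log|det dz/dx|
    log_px = -0.5 * np.sum(z**2) - 0.5 * d * np.log(2 * np.pi) + log_det
    print(log_px)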

Implicit Weight Uncertainty in Neural Networks

pawni/BayesByHypernet 3 Nov 2017

Modern neural networks tend to be overconfident on unseen, noisy or incorrectly labelled data and do not produce meaningful uncertainty measures.

MoGlow: Probabilistic and controllable motion synthesis using normalising flows

chaiyujin/glow-pytorch 16 May 2019

Data-driven modelling and synthesis of motion is an active research area with applications that include animation, games, and social robotics.

Relaxing Bijectivity Constraints with Continuously Indexed Normalising Flows

jrmcornish/cif ICML 2020

We show that normalising flows become pathological when used to model targets whose supports have complicated topologies.

Deep Structural Causal Models for Tractable Counterfactual Inference

biomedia-mira/deepscm NeurIPS 2020

We formulate a general framework for building structural causal models (SCMs) with deep learning components.

The Neural Moving Average Model for Scalable Variational Inference of State Space Models

Tom-Ryder/VIforSSMs 2 Oct 2019

Variational inference has had great success in scaling approximate Bayesian inference to big data by exploiting mini-batch training.
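
To illustrate the mini-batching that this abstract refers to, here is a generic NumPy sketch (unrelated to the VIforSSMs code, with a deliberately simple conjugate-style toy model): the likelihood term computed on a random batch of size B is rescaled by N / B, so the resulting ELBO estimate is unbiased for the full dataset.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy model: y_n ~ N(theta, 1), prior theta ~ N(0, 1),
    # variational posterior q(theta) = N(m, s^2).
    N = 10_000
    y = rng.normal(loc=2.0, scale=1.0, size=N)

    def minibatch_elbo(m, log_s, batch, N, n_samples=8):
        """Unbiased ELBO estimate from a mini-batch."""
        s = np.exp(log_s)
        theta = m + s * rng.normal(size=n_samples)              # samples from q
        loglik = -0.5 * (batch[None, :] - theta[:, None])**2 - 0.5 * np.log(2 * np.pi)
        loglik = loglik.sum(axis=1) * (N / len(batch))           # rescale batch to full data
        logprior = -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)
        logq = -0.5 * ((theta - m) / s)**2 - np.log(s) - 0.5 * np.log(2 * np.pi)
        return np.mean(loglik + logprior - logq)

    batch = rng.choice(y, size=128, replace=False)
    print(minibatch_elbo(m=0.0, log_s=0.0, batch=batch, N=N))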

VFlow: More Expressive Generative Flows with Variational Data Augmentation

thu-ml/vflow ICML 2020

Generative flows are promising tractable models for density modeling that define probabilistic distributions with invertible transformations.

Woodbury Transformations for Deep Generative Flows

yolu1055/WoodburyTransformations NeurIPS 2020

In this paper, we introduce Woodbury transformations, which achieve efficient invertibility via the Woodbury matrix identity and efficient determinant calculation via Sylvester's determinant identity.
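
The two identities named in this abstract are easy to check numerically. The NumPy snippet below verifies the Woodbury matrix identity and Sylvester's determinant identity for a random low-rank update of a diagonal matrix (a sanity check of the underlying linear algebra, not the paper's implementation).

    import numpy as np

    rng = np.random.default_rng(0)
    d, k = 6, 2                                   # full dimension and low rank
    A = np.diag(rng.uniform(1.0, 2.0, size=d))    # cheap-to-invert diagonal part
    U = rng.normal(size=(d, k))
    V = rng.normal(size=(k, d))
    C = np.eye(k)

    M = A + U @ C @ V

    # Woodbury matrix identity: invert M using only a k x k (and a diagonal) inverse.
    A_inv = np.diag(1.0 / np.diag(A))
    small = np.linalg.inv(np.linalg.inv(C) + V @ A_inv @ U)
    M_inv_woodbury = A_inv - A_inv @ U @ small @ V @ A_inv
    print(np.allclose(M_inv_woodbury, np.linalg.inv(M)))        # True

    # Sylvester's determinant identity: det(A + U C V) via a k x k determinant.
    det_sylvester = np.linalg.det(A) * np.linalg.det(np.eye(k) + C @ V @ A_inv @ U)
    print(np.isclose(det_sylvester, np.linalg.det(M)))          # True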

Style-Controllable Speech-Driven Gesture Synthesis Using Normalising Flows

simonalexanderson/StyleGestures Computer Graphics Forum 2020

In interactive scenarios, systems for generating natural animations on the fly are key to achieving believable and relatable characters.

Robust model training and generalisation with Studentising flows

simonalexanderson/StyleGestures 11 Jun 2020

Normalising flows are tractable probabilistic models that leverage the power of deep learning to describe a wide parametric family of distributions, all while remaining trainable using maximum likelihood.
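
As a concrete illustration of the maximum-likelihood training this abstract mentions, here is a minimal PyTorch sketch with a single elementwise affine transform and a Gaussian base distribution (the paper's title suggests swapping in a heavier-tailed Student's t base, but the training loop is the same; this is not the StyleGestures code and all names are illustrative).

    import torch

    torch.manual_seed(0)

    # Exact log-likelihood from the change of variables:
    #   z = (x - mu) * exp(-log_sigma),  log p(x) = log p_base(z) - sum(log_sigma)
    data = 1.5 + 0.5 * torch.randn(2048, 2)            # toy training data

    mu = torch.nn.Parameter(torch.zeros(2))
    log_sigma = torch.nn.Parameter(torch.zeros(2))
    base = torch.distributions.Normal(0.0, 1.0)        # could be replaced by StudentT for heavy tails
    opt = torch.optim.Adam([mu, log_sigma], lr=0.05)

    for step in range(200):
        z = (data - mu) * torch.exp(-log_sigma)
        log_px = base.log_prob(z).sum(dim=1) - log_sigma.sum()
        loss = -log_px.mean()                          # negative log-likelihood
        opt.zero_grad()
        loss.backward()
        opt.step()

    print(mu.detach(), log_sigma.exp().detach())       # should approach the data mean and std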