Normalising Flows

20 papers with code • 0 benchmarks • 0 datasets

Most implemented papers

Block Neural Autoregressive Flow

nicola-decao/BNAF 9 Apr 2019

Recently, as an alternative to hand-crafted bijections, Huang et al. (2018) proposed neural autoregressive flow (NAF) which is a universal approximator for density functions.

MoGlow: Probabilistic and controllable motion synthesis using normalising flows

chaiyujin/glow-pytorch 16 May 2019

Data-driven modelling and synthesis of motion is an active research area with applications that include animation, games, and social robotics.

Relaxing Bijectivity Constraints with Continuously Indexed Normalising Flows

jrmcornish/cif ICML 2020

We show that normalising flows become pathological when used to model targets whose supports have complicated topologies.

Deep Structural Causal Models for Tractable Counterfactual Inference

biomedia-mira/deepscm NeurIPS 2020

We formulate a general framework for building structural causal models (SCMs) with deep learning components.

Implicit Weight Uncertainty in Neural Networks

pawni/BayesByHypernet 3 Nov 2017

Modern neural networks tend to be overconfident on unseen, noisy or incorrectly labelled data and do not produce meaningful uncertainty measures.

OverFlow: Putting flows on top of neural transducers for better TTS

coqui-ai/TTS 13 Nov 2022

Neural HMMs are a type of neural transducer recently proposed for sequence-to-sequence modelling in text-to-speech.

The Neural Moving Average Model for Scalable Variational Inference of State Space Models

Tom-Ryder/VIforSSMs 2 Oct 2019

Variational inference has had great success in scaling approximate Bayesian inference to big data by exploiting mini-batch training.

VFlow: More Expressive Generative Flows with Variational Data Augmentation

thu-ml/vflow ICML 2020

Generative flows are promising tractable models for density modeling that define probability distributions with invertible transformations.
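The "invertible transformations" mechanism behind generative flows is the change-of-variables rule: push a simple base density through an invertible map and correct by the log-determinant of the Jacobian. A minimal sketch (not VFlow's actual code; `a`, `b` are hypothetical flow parameters) for a 1-D affine flow over a standard-normal base:

```python
import numpy as np

a, b = 2.0, 1.0          # hypothetical parameters of the map z -> x = a*z + b

def base_logpdf(z):
    # log density of the standard normal base distribution
    return -0.5 * (z**2 + np.log(2 * np.pi))

def flow_logpdf(x):
    z = (x - b) / a                 # inverse transform x -> z
    log_det_jac = np.log(abs(a))    # |dx/dz| of the forward map
    return base_logpdf(z) - log_det_jac

# Sanity check: the flow density should equal N(b, a^2) evaluated directly
x = 0.7
direct = -0.5 * (((x - b) / a) ** 2 + np.log(2 * np.pi)) - np.log(a)
assert np.isclose(flow_logpdf(x), direct)
```

Real flows stack many such invertible layers and sum their log-determinants, but the density computation is the same rule applied layer by layer.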

Woodbury Transformations for Deep Generative Flows

yolu1055/WoodburyTransformations NeurIPS 2020

In this paper, we introduce Woodbury transformations, which achieve efficient invertibility via the Woodbury matrix identity and efficient determinant calculation via Sylvester's determinant identity.
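The two identities the abstract names can be verified numerically. This sketch (illustrative only, not the paper's implementation; dimensions `d` and rank `r` are arbitrary) shows why a low-rank update A + UV is cheap to invert and to take determinants of when A itself is easy to handle:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 6, 2                              # hypothetical: full dim d, rank r << d
A = np.diag(rng.uniform(1.0, 2.0, d))    # easy-to-invert base matrix (diagonal)
U = rng.normal(size=(d, r))
V = rng.normal(size=(r, d))

A_inv = np.diag(1.0 / np.diag(A))
small = np.eye(r) + V @ A_inv @ U        # r x r matrix: the only "hard" part

# Woodbury matrix identity:
# (A + U V)^{-1} = A^{-1} - A^{-1} U (I + V A^{-1} U)^{-1} V A^{-1}
woodbury_inv = A_inv - A_inv @ U @ np.linalg.solve(small, V @ A_inv)
assert np.allclose(woodbury_inv, np.linalg.inv(A + U @ V))

# Sylvester's determinant identity:
# det(A + U V) = det(A) * det(I + V A^{-1} U)
assert np.isclose(np.linalg.det(A + U @ V),
                  np.linalg.det(A) * np.linalg.det(small))
```

Both identities reduce O(d^3) operations on the full matrix to O(r^3) work on the small r-by-r system, which is what makes such transformations practical inside a flow layer.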

Style-Controllable Speech-Driven Gesture Synthesis Using Normalising Flows

simonalexanderson/StyleGestures Computer Graphics Forum 2020

In interactive scenarios, systems for generating natural animations on the fly are key to achieving believable and relatable characters.