Density Estimation

416 papers with code • 14 benchmarks • 14 datasets

The goal of Density Estimation is to accurately model the underlying probability density of an observed data set whose true density is unknown.
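As a minimal illustration of the task, the sketch below estimates a density from samples with a classical Gaussian kernel density estimator (a NumPy-only toy, not one of the neural methods listed on this page; the function name and bandwidth are illustrative choices):

```python
import numpy as np

def gaussian_kde(samples, query, bandwidth=0.5):
    """Estimate the density at each query point from observed samples
    by averaging Gaussian kernels centered at each sample."""
    samples = np.asarray(samples, dtype=float)
    query = np.asarray(query, dtype=float)
    # Pairwise differences between query points and samples.
    diffs = query[:, None] - samples[None, :]
    # Gaussian kernel evaluated at each difference, then averaged per query.
    kernels = np.exp(-0.5 * (diffs / bandwidth) ** 2) / (bandwidth * np.sqrt(2 * np.pi))
    return kernels.mean(axis=1)

# Samples drawn from a standard normal; the estimate should peak near 0.
rng = np.random.default_rng(0)
data = rng.normal(size=1000)
density = gaussian_kde(data, np.array([-3.0, 0.0, 3.0]))
```

The bandwidth controls the bias-variance trade-off: smaller values track the samples more closely but produce a noisier estimate.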

Source: Contrastive Predictive Coding Based Feature for Automatic Speaker Verification

Most implemented papers

PointConv: Deep Convolutional Networks on 3D Point Clouds

DylanWusee/pointconv CVPR 2019

In addition, experiments converting CIFAR-10 into a point cloud show that networks built on PointConv can match the performance of convolutional networks on 2D images of similar structure.

Neural Spline Flows

bayesiains/nsf NeurIPS 2019

A normalizing flow models a complex probability density as an invertible transformation of a simple base density.
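The change-of-variables formula behind this idea can be sketched with a toy affine flow (a hypothetical one-parameter transform for illustration, not the paper's spline transform): the model log-density is the base log-density at the inverted point minus the log-determinant of the Jacobian.

```python
import numpy as np

def base_logpdf(z):
    """Log-density of the simple base distribution (standard normal)."""
    return -0.5 * z**2 - 0.5 * np.log(2 * np.pi)

def flow_logpdf(x, a=2.0, b=1.0):
    """Log-density under the flow x = a*z + b via change of variables:
    log p_x(x) = log p_z(f^{-1}(x)) - log |det df/dz|."""
    z = (x - b) / a              # invert the transform
    log_det = np.log(abs(a))     # Jacobian of the forward map is a
    return base_logpdf(z) - log_det

# This particular flow reproduces the N(b, a^2) log-density exactly.
x = np.array([0.0, 1.0, 3.0])
lp = flow_logpdf(x)
```

Neural Spline Flows replace the affine map with monotonic rational-quadratic splines, keeping invertibility and a tractable Jacobian while gaining much more flexibility.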

PixelCNN++: Improving the PixelCNN with Discretized Logistic Mixture Likelihood and Other Modifications

openai/pixel-cnn 19 Jan 2017

We use a discretized logistic mixture likelihood on the pixels, rather than a 256-way softmax, which we find speeds up training.
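The core of the discretized logistic likelihood can be sketched for a single component (PixelCNN++ mixes several such components and handles numerical stability more carefully; the function name here is illustrative): the probability of an integer pixel value is the logistic CDF difference over its bin, with open-ended bins at 0 and 255.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def discretized_logistic_prob(v, mu, s):
    """Probability of integer pixel value v in [0, 255] under a logistic
    distribution (mean mu, scale s), discretized to the pixel grid.
    Each bin is [v - 0.5, v + 0.5]; the edge bins extend to +/- infinity."""
    upper = np.where(v == 255, 1.0, sigmoid((v + 0.5 - mu) / s))
    lower = np.where(v == 0, 0.0, sigmoid((v - 0.5 - mu) / s))
    return upper - lower

# Probabilities over all 256 pixel values for one logistic component.
vals = np.arange(256)
probs = discretized_logistic_prob(vals, mu=128.0, s=10.0)
```

Because the bin edges telescope, the probabilities sum to exactly one, so this is a proper distribution over the 256 pixel values without a softmax.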

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models

rtqichen/ffjord ICLR 2019

The result is a continuous-time invertible generative model with unbiased density estimation and one-pass sampling, while allowing unrestricted neural network architectures.

PixelSNAIL: An Improved Autoregressive Generative Model

neocxi/pixelsnail-public ICML 2018

Autoregressive generative models consistently achieve the best results in density estimation tasks involving high dimensional data, such as images or audio.

It's Raw! Audio Generation with State-Space Models

hazyresearch/state-spaces 20 Feb 2022

SaShiMi yields state-of-the-art performance for unconditional waveform generation in the autoregressive setting.

Representation Learning: A Review and New Perspectives

clvrai/representation-learning-by-learning-to-count 24 Jun 2012

The success of machine learning algorithms generally depends on data representation, and we hypothesize that this is because different representations can entangle and hide more or less the different explanatory factors of variation behind the data.

The Concrete Distribution: A Continuous Relaxation of Discrete Random Variables

tensorflow/models 2 Nov 2016

The essence of the trick is to refactor each stochastic node into a differentiable function of its parameters and a random variable with fixed distribution.
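That refactoring can be sketched with the Gumbel-softmax form of the Concrete distribution (a minimal NumPy illustration; the function name is ours): a relaxed one-hot sample becomes a deterministic, differentiable function of the logits and fixed-distribution Gumbel noise.

```python
import numpy as np

def sample_concrete(logits, temperature, rng):
    """Draw a relaxed one-hot sample via the Gumbel-softmax trick."""
    # Randomness with a fixed distribution: standard Gumbel noise.
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    # Deterministic, differentiable transform of parameters + noise.
    y = (logits + gumbel) / temperature
    y = y - y.max()                 # shift for numerical stability
    expy = np.exp(y)
    return expy / expy.sum()        # softmax -> point on the simplex

rng = np.random.default_rng(0)
sample = sample_concrete(np.array([2.0, 0.5, -1.0]), temperature=0.5, rng=rng)
```

As the temperature approaches zero the samples approach discrete one-hot vectors, while at higher temperatures they stay smooth enough for gradients to flow through the sampling step.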

Neural Autoregressive Flows

CW-Huang/NAF ICML 2018

Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time, via Inverse Autoregressive Flows (IAF).

Invertible Residual Networks

jhjacobsen/invertible-resnet 2 Nov 2018

We show that standard ResNet architectures can be made invertible, allowing the same model to be used for classification, density estimation, and generation.