Bayesian Inference

471 papers with code • 0 benchmarks • 6 datasets

Bayesian Inference is a methodology that employs Bayes' rule to estimate model parameters and their full posterior distribution.
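As a concrete instance, a conjugate Beta-Bernoulli model makes the posterior update a closed-form application of Bayes' rule. A minimal sketch with made-up coin-flip data (illustrative only, not tied to any paper below):

```python
import numpy as np

# Hypothetical coin-flip example: Beta(a, b) prior over the heads
# probability, Bernoulli likelihood for each observed flip.
a, b = 1.0, 1.0                               # uniform Beta(1, 1) prior
flips = np.array([1, 1, 0, 1, 1, 0, 1, 1])    # 6 heads, 2 tails

# With a conjugate prior, Bayes' rule reduces to a counting update:
# the posterior is Beta(a + heads, b + tails).
heads = flips.sum()
tails = len(flips) - heads
post_a, post_b = a + heads, b + tails

posterior_mean = post_a / (post_a + post_b)   # (1+6) / (1+6 + 1+2) = 0.7
```

The posterior mean 0.7 sits between the prior mean (0.5) and the empirical frequency (0.75), shrinking toward the prior when data is scarce.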

Most implemented papers

Weight Uncertainty in Neural Networks

tensorflow/models 20 May 2015

We introduce a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the weights of a neural network, called Bayes by Backprop.
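A minimal single-weight sketch of the Bayes by Backprop idea: a Gaussian variational posterior over a weight, trained with pathwise (reparameterized) gradients of the negative ELBO. The toy data, model, and hyperparameters are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise; the assumed model is y = w*x with known
# observation noise, so the true posterior over w is Gaussian.
x = np.linspace(-1.0, 1.0, 50)
y = 2.0 * x + 0.1 * rng.normal(size=50)
noise_var = 0.01

def softplus(z):
    return np.log1p(np.exp(z))

# Variational posterior q(w) = N(mu, softplus(rho)^2); prior p(w) = N(0, 1).
mu, rho = 0.0, 0.0
lr = 1e-4

for _ in range(2000):
    sigma = softplus(rho)
    eps = rng.normal()
    w = mu + sigma * eps                          # reparameterization trick

    # Gradient of the negative log-likelihood w.r.t. the sampled weight.
    d_nll_dw = -np.sum(x * (y - w * x)) / noise_var

    # Analytic gradients of KL(q || N(0, 1)).
    d_kl_dmu = mu
    d_kl_dsigma = sigma - 1.0 / sigma
    d_sigma_drho = 1.0 / (1.0 + np.exp(-rho))     # derivative of softplus

    # Pathwise gradients of the negative ELBO (the Bayes by Backprop step).
    mu -= lr * (d_nll_dw + d_kl_dmu)
    rho -= lr * (d_nll_dw * eps + d_kl_dsigma) * d_sigma_drho
```

After training, `mu` approaches the true slope 2 and `softplus(rho)` shrinks toward the narrow posterior standard deviation implied by the data.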

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

yaringal/DropoutUncertaintyExps 6 Jun 2015

In comparison, Bayesian models offer a mathematically grounded framework to reason about model uncertainty, but usually come with a prohibitive computational cost.
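The paper's practical recipe (MC dropout) keeps dropout active at test time and treats the spread of repeated stochastic forward passes as a model-uncertainty estimate. A minimal numpy sketch with hypothetical fixed weights standing in for a trained network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer MLP with fixed (pretend pre-trained) weights;
# the point here is the inference-time behaviour, not training.
W1 = rng.normal(size=(1, 32))
W2 = rng.normal(size=(32, 1)) / np.sqrt(32)

def predict(x, drop_p=0.5):
    """One stochastic forward pass with dropout kept ON at test time."""
    h = np.maximum(x @ W1, 0.0)
    mask = rng.random(h.shape) > drop_p
    h = h * mask / (1.0 - drop_p)        # inverted-dropout scaling
    return h @ W2

x = np.array([[0.5]])
samples = np.concatenate([predict(x) for _ in range(200)])
pred_mean = samples.mean()               # predictive mean
pred_std = samples.std()                 # epistemic-uncertainty proxy
```

Averaging many dropout-perturbed passes approximates the predictive distribution of the corresponding deep Gaussian-process posterior.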

Semi-Supervised Learning with Deep Generative Models

dpkingma/nips14-ssl NeurIPS 2014

The ever-increasing size of modern data sets, combined with the difficulty of obtaining label information, has made semi-supervised learning a problem of significant practical importance in modern data analysis.

Bayesian regression and Bitcoin

panditanvita/BTCpredictor 6 Oct 2014

In this paper, we discuss the method of Bayesian regression and its efficacy for predicting price variation of Bitcoin, a recently popularized virtual cryptographic currency.
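With a Gaussian prior and Gaussian noise, Bayesian linear regression has a closed-form posterior over the weights. A sketch on synthetic data (not Bitcoin prices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic regression data: y = X @ w_true + noise.
w_true = np.array([1.5, -0.7])
X = rng.normal(size=(200, 2))
y = X @ w_true + 0.1 * rng.normal(size=200)

alpha = 1.0     # prior precision: w ~ N(0, alpha^-1 I)
beta = 100.0    # noise precision: 1 / 0.1^2

# Conjugate Gaussian posterior over the weights:
#   Sigma = (alpha*I + beta*X^T X)^-1,   mu = beta * Sigma @ X^T @ y
Sigma = np.linalg.inv(alpha * np.eye(2) + beta * X.T @ X)
mu = beta * Sigma @ X.T @ y
```

`mu` is the posterior mean (a ridge-regularized least-squares estimate) and `Sigma` quantifies the remaining uncertainty in the weights.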

A Theoretically Grounded Application of Dropout in Recurrent Neural Networks

HKUST-KnowComp/R-Net NeurIPS 2016

Recent results at the intersection of Bayesian modelling and deep learning offer a Bayesian interpretation of common deep learning techniques such as dropout.
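The paper's prescription for RNNs is to sample one dropout mask per sequence and reuse it at every timestep, rather than resampling a fresh mask per step. A sketch with a hypothetical vanilla RNN cell and fixed illustrative weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vanilla RNN cell with fixed weights (illustrative only).
Wx = rng.normal(size=(4, 8)) * 0.3
Wh = rng.normal(size=(8, 8)) * 0.3

def rnn_forward(xs, drop_p=0.25):
    """Run the RNN with ONE dropout mask shared across all timesteps."""
    h = np.zeros(8)
    mask = (rng.random(8) > drop_p) / (1.0 - drop_p)   # sampled once
    for x in xs:
        h = np.tanh(x @ Wx + (h * mask) @ Wh)          # same mask each step
    return h

xs = rng.normal(size=(10, 4))   # a toy sequence of 10 steps
h_final = rnn_forward(xs)
```

Sharing the mask corresponds to sampling one set of weights per sequence, which is what the variational interpretation of dropout requires.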

Variational Autoencoders for Collaborative Filtering

dawenl/vae_cf 16 Feb 2018

This non-linear probabilistic model enables us to go beyond the limited modeling capacity of linear factor models which still largely dominate collaborative filtering research. We introduce a generative model with multinomial likelihood and use Bayesian inference for parameter estimation.
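The multinomial likelihood scores a user's click vector against the model's softmax output over items. A minimal sketch with toy click counts and made-up logits:

```python
import numpy as np

def multinomial_log_likelihood(x, logits):
    """Multinomial log-likelihood of click counts x under softmax(logits)."""
    log_probs = logits - np.log(np.sum(np.exp(logits)))   # log-softmax
    return float(np.sum(x * log_probs))

x = np.array([1.0, 0.0, 2.0, 0.0])          # click counts over 4 items
good = np.array([2.0, -2.0, 3.0, -2.0])     # mass on the clicked items
bad = np.array([-2.0, 3.0, -2.0, 2.0])      # mass on unclicked items

ll_good = multinomial_log_likelihood(x, good)
ll_bad = multinomial_log_likelihood(x, bad)
```

Logits that concentrate probability on the clicked items yield a higher log-likelihood, which is exactly the signal the model is trained on.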

Variational Dropout and the Local Reparameterization Trick

kumar-shridhar/BayesianConvNet NeurIPS 2015

Our method allows inference of more flexibly parameterized posteriors; specifically, we propose variational dropout, a generalization of Gaussian dropout where the dropout rates are learned, often leading to better models.
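The local reparameterization trick underlying this line of work: with a factorized Gaussian posterior over a weight matrix, sample the (Gaussian) pre-activations directly instead of sampling one weight matrix for the whole minibatch, giving independent per-example noise and lower-variance gradients. A numpy sketch with illustrative shapes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Factorized Gaussian posterior over a weight matrix: W ~ N(W_mu, W_var).
x = rng.normal(size=(16, 5))          # a minibatch of 16 inputs
W_mu = rng.normal(size=(5, 3))
W_var = 0.1 * rng.random(size=(5, 3)) + 1e-3

# Local reparameterization: the pre-activations x @ W are Gaussian with
# these moments, so sample them directly, one noise draw per example.
act_mu = x @ W_mu
act_var = (x ** 2) @ W_var
acts = act_mu + np.sqrt(act_var) * rng.normal(size=act_mu.shape)
```

This is distribution-equivalent to sampling W per example, but only requires noise of the activation shape rather than the (much larger) weight shape.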

Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm

DartML/Stein-Variational-Gradient-Descent NeurIPS 2016

We propose a general purpose variational inference algorithm that forms a natural counterpart of gradient descent for optimization.
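The SVGD update moves a set of particles along a kernel-weighted score direction plus a repulsive term that keeps them spread out. A sketch on a 1-D Gaussian target with an RBF kernel and the median bandwidth heuristic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: p(x) = N(3, 1), so grad log p(x) = -(x - 3).
def grad_log_p(x):
    return -(x - 3.0)

particles = rng.normal(size=50)       # initialized near N(0, 1)
n = len(particles)

for _ in range(1000):
    diffs = particles[:, None] - particles[None, :]
    h = np.median(diffs ** 2) / np.log(n + 1.0)    # median heuristic
    k = np.exp(-diffs ** 2 / h)                    # RBF kernel matrix
    grad_k = 2.0 * diffs / h * k                   # grad of k w.r.t. x_j
    # SVGD direction: kernel-weighted score plus repulsive term.
    phi = (k @ grad_log_p(particles) + grad_k.sum(axis=1)) / n
    particles += 0.2 * phi
```

The particles drift toward the target mean while the repulsive term prevents them from collapsing to the mode, approximating the full posterior.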

Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows

gpapamak/snl 18 May 2018

We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible.
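SNL itself trains an autoregressive flow as a neural likelihood estimator, which is too involved to sketch here; rejection ABC, a much simpler likelihood-free baseline for the same setting (intractable likelihood, cheap simulator), illustrates the problem setup. The simulator and observation below are toy choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulator: x ~ N(theta, 1); only forward simulation is available,
# never an explicit likelihood evaluation.
def simulate(theta):
    return theta + rng.normal()

x_obs = 2.0

# Rejection ABC: draw theta from the prior, simulate, keep theta only
# when the simulation lands within a tolerance of the observation.
accepted = []
for _ in range(20000):
    theta = rng.uniform(-5.0, 5.0)      # flat prior on [-5, 5]
    if abs(simulate(theta) - x_obs) < 0.2:
        accepted.append(theta)

posterior_mean = float(np.mean(accepted))
```

The accepted samples approximate the posterior p(theta | x_obs); SNL's contribution is to replace this wasteful rejection loop with a learned, sequentially refined neural likelihood.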

Deep Neural Networks as Gaussian Processes

brain-research/nngp ICLR 2018

Previous work, however, had not identified that these kernels can be used as covariance functions for GPs, enabling fully Bayesian prediction with a deep neural network.
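Once a kernel is in hand, fully Bayesian prediction is exact GP regression. A sketch using a standard RBF kernel as a stand-in (the paper derives the specific NNGP kernel of an infinite-width network):

```python
import numpy as np

def rbf(a, b):
    # Squared-exponential kernel, unit lengthscale and unit variance.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2)

X = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])    # toy 1-D training inputs
y = np.sin(X)
K = rbf(X, X) + 1e-4 * np.eye(len(X))        # jitter / tiny observation noise
K_inv_y = np.linalg.solve(K, y)

def gp_predict(x_star):
    """Exact GP posterior mean and variance at test inputs x_star."""
    x_star = np.atleast_1d(x_star)
    k_star = rbf(x_star, X)                  # (m, n) cross-covariances
    mean = k_star @ K_inv_y
    var = 1.0 - np.sum(k_star.T * np.linalg.solve(K, k_star.T), axis=0)
    return mean, var
```

Near the training points the predictive variance collapses toward the noise level; far away it reverts to the prior variance, which is the calibrated-uncertainty behaviour the paper exploits.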