# Bayesian Inference

471 papers with code • 0 benchmarks • 6 datasets

Bayesian Inference is a methodology that employs Bayes' rule to estimate parameters and their full posterior distribution.
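
As an illustrative sketch (not taken from any of the papers below), the conjugate Beta-Bernoulli model shows Bayes' rule in its simplest closed form: a Beta(a, b) prior over a coin's heads probability, updated with observed flips, yields an exact Beta posterior.

```python
import numpy as np

# Illustrative sketch: Bayes' rule in a conjugate Beta-Bernoulli model.
# Prior Beta(a, b) over a coin's heads probability; observing h heads and
# t tails gives the exact posterior Beta(a + h, b + t).

def beta_bernoulli_posterior(a, b, flips):
    """Update a Beta(a, b) prior with a sequence of 0/1 observations."""
    h = sum(flips)
    t = len(flips) - h
    return a + h, b + t

a_post, b_post = beta_bernoulli_posterior(1.0, 1.0, [1, 1, 0, 1])
post_mean = a_post / (a_post + b_post)  # posterior mean = 4/6
```

Most of the methods surveyed below tackle the cases where no such closed-form posterior exists, via variational approximations or sampling.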

## Most implemented papers

### Weight Uncertainty in Neural Networks

We introduce a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the weights of a neural network, called Bayes by Backprop.
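
A minimal sketch of the idea, with assumed variable names: each weight carries a learned Gaussian posterior, weights are drawn via the reparameterization trick so that ordinary backpropagation can update the variational parameters, and a KL penalty against the prior is added to the data loss.

```python
import numpy as np

# Hedged sketch of the Bayes by Backprop idea for a single weight: a learned
# Gaussian posterior q(w) = N(mu, sigma^2), sampled via w = mu + sigma * eps
# so gradients flow through mu and sigma by ordinary backpropagation.
# Parameter values here are illustrative, not from the paper.

rng = np.random.default_rng(0)

mu, rho = 0.1, -3.0                  # variational parameters for one weight
sigma = np.log1p(np.exp(rho))        # softplus keeps sigma positive
eps = rng.standard_normal()
w = mu + sigma * eps                 # one Monte Carlo weight sample

# KL(q(w) || p(w)) against a standard-normal prior p(w) = N(0, 1): the
# "complexity cost" that Bayes by Backprop adds to the expected data loss.
kl = np.log(1.0 / sigma) + (sigma**2 + mu**2) / 2.0 - 0.5
```

In practice this is done per-weight over whole layers, with the KL term weighted against a minibatch likelihood.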

### Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

In comparison, Bayesian models offer a mathematically grounded framework to reason about model uncertainty, but usually come with a prohibitive computational cost.
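
The paper's practical recipe, Monte Carlo dropout, can be sketched in a few lines: keep dropout active at test time, run several stochastic forward passes, and read the spread of the predictions as an approximate uncertainty estimate. The toy one-layer "network" below is illustrative only.

```python
import numpy as np

# Hedged sketch of Monte Carlo dropout: dropout stays on at test time, and
# the mean/std over T stochastic forward passes give an approximate
# predictive mean and model-uncertainty estimate. Weights are random toys.

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 1))       # toy single-layer "network"
x = rng.standard_normal(8)

def forward(x, W, p=0.5):
    mask = rng.random(x.shape) > p    # Bernoulli dropout mask
    return (x * mask / (1 - p)) @ W   # inverted-dropout scaling

preds = np.array([forward(x, W) for _ in range(200)])
mean, std = preds.mean(), preds.std() # predictive mean and uncertainty
```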

### Semi-Supervised Learning with Deep Generative Models

The ever-increasing size of modern data sets, combined with the difficulty of obtaining label information, has made semi-supervised learning a problem of significant practical importance in modern data analysis.

### Bayesian regression and Bitcoin

In this paper, we discuss the method of Bayesian regression and its efficacy for predicting price variation of Bitcoin, a recently popularized virtual, cryptographic currency.
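
For context (this is not the paper's exact Bitcoin setup), the simplest conjugate form of Bayesian regression has a closed-form Gaussian posterior over the weights: a Gaussian prior with precision `alpha` and Gaussian observation noise with precision `beta` give the posterior mean and covariance directly.

```python
import numpy as np

# Illustrative conjugate Bayesian linear regression: prior w ~ N(0, I/alpha),
# noise precision beta. The weight posterior is Gaussian with covariance
# S = (alpha*I + beta*X^T X)^-1 and mean m = beta * S X^T y.
# Data and precisions are synthetic/assumed, not from the paper.

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(50)

alpha, beta = 1.0, 100.0              # prior and noise precisions (assumed)
S = np.linalg.inv(alpha * np.eye(3) + beta * X.T @ X)  # posterior covariance
m = beta * S @ X.T @ y                                 # posterior mean
```

With enough data the posterior mean recovers the generating weights; the covariance `S` quantifies remaining uncertainty.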

### A Theoretically Grounded Application of Dropout in Recurrent Neural Networks

Recent results at the intersection of Bayesian modelling and deep learning offer a Bayesian interpretation of common deep learning techniques such as dropout.

### Variational Autoencoders for Collaborative Filtering

This non-linear probabilistic model enables us to go beyond the limited modeling capacity of linear factor models which still largely dominate collaborative filtering research. We introduce a generative model with multinomial likelihood and use Bayesian inference for parameter estimation.
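
The multinomial likelihood itself is simple: the decoder produces logits over all items, a softmax turns them into item probabilities, and the log-likelihood of a user's click vector x is the sum of x_i * log(pi_i). The sketch below uses made-up logits and a four-item catalog for illustration.

```python
import numpy as np

# Sketch of the multinomial log-likelihood used for implicit feedback:
# softmax over decoder logits gives item probabilities pi, and a click
# vector x scores sum_i x_i * log(pi_i). Values here are illustrative.

def multinomial_log_lik(logits, x):
    log_pi = logits - np.log(np.sum(np.exp(logits)))  # log-softmax
    return float(np.dot(x, log_pi))

logits = np.array([2.0, 0.5, -1.0, 0.0])  # decoder output over 4 items
x = np.array([1.0, 0.0, 1.0, 0.0])        # user clicked items 0 and 2
ll = multinomial_log_lik(logits, x)
```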

### Variational Dropout and the Local Reparameterization Trick

Our method allows inference of more flexibly parameterized posteriors; specifically, we propose variational dropout, a generalization of Gaussian dropout where the dropout rates are learned, often leading to better models.
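
The local reparameterization trick can be sketched as follows: rather than sampling a separate weight matrix for each example, sample the pre-activations directly. For a factorized Gaussian posterior over the weights, the pre-activation b = xW is itself Gaussian, with mean x @ mu and variance x² @ sigma², which lowers the variance of the stochastic gradient estimator. Shapes and values below are illustrative.

```python
import numpy as np

# Hedged sketch of the local reparameterization trick: for q(W) with
# independent entries N(mu_ij, sigma_ij^2), sample the Gaussian
# pre-activations b = x W directly instead of sampling W per example.

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 10))           # a mini-batch of inputs
mu = rng.standard_normal((10, 4)) * 0.1     # posterior means of weights
log_sigma2 = np.full((10, 4), -4.0)         # posterior log-variances

gamma = x @ mu                              # pre-activation mean
delta = (x**2) @ np.exp(log_sigma2)         # pre-activation variance
b = gamma + np.sqrt(delta) * rng.standard_normal(gamma.shape)
```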

### Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm

We propose a general purpose variational inference algorithm that forms a natural counterpart of gradient descent for optimization.
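
A minimal one-dimensional sketch of the SVGD update: each particle is moved by a kernel-weighted average of the target's score function (pulling particles toward high density) plus a repulsive kernel-gradient term (keeping them spread out). The target here is a standard normal; bandwidth and step size are illustrative choices rather than the paper's adaptive heuristics.

```python
import numpy as np

# Hedged 1-D sketch of Stein variational gradient descent (SVGD):
# phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * grad log p(x_j)
#                            + grad_{x_j} k(x_j, x_i) ]
# Target is N(0, 1); fixed RBF bandwidth and step size are assumptions.

rng = np.random.default_rng(0)
x = rng.uniform(-10, 10, size=50)       # particles, deliberately far off

def grad_log_p(x):                      # score of a standard normal target
    return -x

h, step = 1.0, 0.1
for _ in range(1000):
    diff = x[:, None] - x[None, :]      # pairwise differences x_i - x_j
    k = np.exp(-diff**2 / (2 * h))      # RBF kernel matrix (symmetric)
    # driving term pulls toward high density; repulsion spreads particles
    phi = (k @ grad_log_p(x) + (diff * k).sum(axis=1) / h) / len(x)
    x = x + step * phi
```

After the updates the particle cloud approximates the target posterior; mean and standard deviation of the particles estimate the posterior moments.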

### Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows

We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible.

### Deep Neural Networks as Gaussian Processes

Previous work had not identified that these kernels can be used as covariance functions for GPs, allowing fully Bayesian prediction with a deep neural network.