Bayesian Inference

64 papers with code · Methodology

Bayesian Inference is a methodology that employs Bayes' rule to estimate parameters (and their full posterior distribution).
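
In symbols, for a parameter vector θ and observed data D, the posterior follows from Bayes' rule (stated here in general form, not tied to any particular paper below):

```latex
p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{p(D)},
\qquad
p(D) = \int p(D \mid \theta)\, p(\theta)\, \mathrm{d}\theta .
```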

State-of-the-art leaderboards

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Greatest papers with code

Weight Uncertainty in Neural Networks

20 May 2015 tensorflow/models

We introduce a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the weights of a neural network, called Bayes by Backprop. It regularises the weights by minimising a compression cost, known as the variational free energy or the expected lower bound on the marginal likelihood.
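
As a rough illustration of the idea, the sketch below implements a single linear layer with a factorised Gaussian posterior over its weights, trained by minimising a reparameterised data-fit term plus a KL penalty against a standard-normal prior. The class name, prior, and KL weight are illustrative assumptions made here, not the paper's reference code.

```python
# Minimal sketch of the Bayes by Backprop idea for one linear layer (illustrative,
# not the reference implementation). Assumes a factorised Gaussian posterior over
# the weights and a standard-normal prior.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesianLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        # Variational parameters: mean and (pre-softplus) scale of q(w).
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -5.0))

    def forward(self, x):
        sigma = F.softplus(self.w_rho)
        # Reparameterised sample w = mu + sigma * eps, so gradients reach mu and rho.
        eps = torch.randn_like(sigma)
        w = self.w_mu + sigma * eps
        # KL(q(w) || p(w)) between diagonal Gaussians, with p(w) = N(0, 1).
        self.kl = (torch.log(1.0 / sigma) + (sigma ** 2 + self.w_mu ** 2) / 2.0 - 0.5).sum()
        return F.linear(x, w)


layer = BayesianLinear(3, 1)
x, y = torch.randn(8, 3), torch.randn(8, 1)
pred = layer(x)
# Variational free energy: data-fit term plus the KL "compression cost"
# (the 1e-3 weighting is an arbitrary choice for this sketch).
loss = F.mse_loss(pred, y) + 1e-3 * layer.kl
loss.backward()
```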

BAYESIAN INFERENCE

ZhuSuan: A Library for Bayesian Deep Learning

18 Sep 2017 thu-ml/zhusuan

In this paper we introduce ZhuSuan, a Python probabilistic programming library for Bayesian deep learning, which conjoins the complementary advantages of Bayesian methods and deep learning. ZhuSuan is built upon TensorFlow.

PROBABILISTIC PROGRAMMING

Semi-Supervised Learning with Deep Generative Models

NeurIPS 2014 probtorch/probtorch

The ever-increasing size of modern data sets, combined with the difficulty of obtaining label information, has made semi-supervised learning a problem of significant practical importance in modern data analysis. We revisit the approach to semi-supervised learning with generative models and develop new models that allow for effective generalisation from small labelled data sets to large unlabelled ones.

BAYESIAN INFERENCE

A Theoretically Grounded Application of Dropout in Recurrent Neural Networks

NeurIPS 2016 martin-gorner/tensorflow-rnn-shakespeare

A major difficulty with recurrent models is their tendency to overfit, and dropout has been shown to fail when applied to recurrent layers. Recent results at the intersection of Bayesian modelling and deep learning offer a Bayesian interpretation of common deep learning techniques such as dropout.
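
The recurrent variant that follows from this interpretation keeps the same dropout mask for every time step of a sequence rather than resampling it at each step. The sketch below is a minimal, assumed illustration of that idea; the locked_dropout name and tensor layout are choices made here, not the repository's code.

```python
# Minimal sketch of "same dropout mask at every time step" (illustrative only).
import torch

def locked_dropout(x, p=0.5, training=True):
    """x: (seq_len, batch, features). Samples one mask per sequence and
    reuses it across all time steps, instead of resampling at each step."""
    if not training or p == 0.0:
        return x
    mask = x.new_empty((1, x.size(1), x.size(2))).bernoulli_(1 - p) / (1 - p)
    return x * mask  # the single mask broadcasts over the time dimension

h = torch.randn(10, 4, 8)          # seq_len=10, batch=4, features=8
h_dropped = locked_dropout(h, p=0.3)
```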

BAYESIAN INFERENCE · LANGUAGE MODELLING · SENTIMENT ANALYSIS

Bayesian regression and Bitcoin

6 Oct 2014 panditanvita/BTCpredictor

In this paper, we discuss the method of Bayesian regression and its efficacy for predicting price variation of Bitcoin, a recently popularized virtual, cryptographic currency. Bayesian regression refers to utilizing empirical data as a proxy to perform Bayesian inference.
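
For orientation, the sketch below works through the conjugate Gaussian case of Bayesian linear regression, where the posterior over the weights is available in closed form. The data and hyperparameters are synthetic assumptions and are unrelated to the paper's Bitcoin setup or its latent-source construction.

```python
# Minimal sketch of Bayesian linear regression with a Gaussian prior on the
# weights and known noise variance (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))                   # design matrix
w_true = np.array([1.5, -0.7])
y = X @ w_true + 0.1 * rng.normal(size=50)     # noisy observations

alpha, beta = 1.0, 100.0                       # prior precision, noise precision
# Posterior over weights: N(m, S) with S^-1 = alpha*I + beta*X^T X and m = beta*S X^T y.
S_inv = alpha * np.eye(2) + beta * X.T @ X
S = np.linalg.inv(S_inv)
m = beta * S @ X.T @ y
print("posterior mean:", m)                    # close to w_true
print("posterior covariance:", S)
```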

BAYESIAN INFERENCE

Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm

NeurIPS 2016 DartML/Stein-Variational-Gradient-Descent

We propose a general purpose variational inference algorithm that forms a natural counterpart of gradient descent for optimization. Our method iteratively transports a set of particles to match the target distribution, by applying a form of functional gradient descent that minimizes the KL divergence.
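
A minimal one-dimensional sketch of the particle update with an RBF kernel is given below; the target distribution, bandwidth, and step size are arbitrary assumptions rather than the DartML reference implementation.

```python
# Minimal 1-D sketch of Stein Variational Gradient Descent (illustrative only).
import numpy as np

def svgd_step(x, score, h=1.0, step=0.1):
    """x: (n,) particles; score: gradient of the log target density."""
    diff = x[:, None] - x[None, :]              # diff[j, i] = x_j - x_i
    k = np.exp(-diff ** 2 / h)                  # RBF kernel k(x_j, x_i)
    grad_k = -2.0 * diff / h * k                # d k(x_j, x_i) / d x_j
    # phi(x_i) = mean_j [ k(x_j, x_i) * score(x_j) + d k(x_j, x_i)/d x_j ]
    phi = (k * score(x)[:, None] + grad_k).mean(axis=0)
    return x + step * phi

rng = np.random.default_rng(0)
particles = rng.normal(loc=-5.0, size=100)      # initialise far from the target
for _ in range(500):
    particles = svgd_step(particles, score=lambda x: -(x - 2.0))  # target N(2, 1)

print(particles.mean(), particles.std())        # roughly 2.0 and 1.0
```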

BAYESIAN INFERENCE

Bayesian Convolutional Neural Networks with Variational Inference

15 Jun 2018 kumar-shridhar/PyTorch-BayesianCNN

We introduce Bayesian convolutional neural networks with variational inference, a variant of convolutional neural networks (CNNs), in which the intractable posterior probability distributions over weights are inferred by Bayes by Backprop. We demonstrate how this reliable variational inference method can serve as a fundamental construct for various network architectures.

BAYESIAN INFERENCE

Stochastic Backpropagation and Approximate Inference in Deep Generative Models

16 Jan 2014 yburda/iwae

We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning. Our algorithm introduces a recognition model to represent approximate posterior distributions, which acts as a stochastic encoder of the data.
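
A minimal sketch of such a recognition model is shown below: a small network outputs the mean and log-variance of a Gaussian over latent codes, and a reparameterised sample keeps the stochastic encoding differentiable. Layer sizes and names are assumptions made here, not the listed repository's code.

```python
# Minimal sketch of a Gaussian recognition model with a reparameterised sample
# (illustrative only).
import torch
import torch.nn as nn

class RecognitionModel(nn.Module):
    def __init__(self, x_dim, z_dim):
        super().__init__()
        self.hidden = nn.Linear(x_dim, 64)
        self.mu = nn.Linear(64, z_dim)
        self.log_var = nn.Linear(64, z_dim)

    def forward(self, x):
        h = torch.tanh(self.hidden(x))
        mu, log_var = self.mu(h), self.log_var(h)
        # Stochastic encoding: z = mu + sigma * eps keeps the sample differentiable
        # w.r.t. the recognition model's parameters (stochastic backpropagation).
        z = mu + torch.exp(0.5 * log_var) * torch.randn_like(mu)
        return z, mu, log_var

enc = RecognitionModel(x_dim=784, z_dim=20)
z, mu, log_var = enc(torch.randn(8, 784))
```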

BAYESIAN INFERENCE

Variational Autoencoders for Collaborative Filtering

16 Feb 2018 dawenl/vae_cf

We introduce a generative model with a multinomial likelihood and use Bayesian inference for parameter estimation. This non-linear probabilistic model enables us to go beyond the limited modeling capacity of the linear factor models that still largely dominate collaborative filtering research. We also provide extended experiments comparing the multinomial likelihood with other commonly used likelihood functions in the latent factor collaborative filtering literature and show favorable results.
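
The multinomial likelihood term itself is simple to write down: the decoder produces logits over items, and the log-likelihood is the user's click counts weighted by the log-softmax probabilities. The sketch below is an assumed illustration with made-up shapes, not the dawenl/vae_cf code.

```python
# Minimal sketch of the multinomial log-likelihood over items (illustrative only).
import torch
import torch.nn.functional as F

logits = torch.randn(4, 1000)                      # decoder output: 4 users, 1000 items
clicks = torch.randint(0, 2, (4, 1000)).float()    # click / count matrix
log_probs = F.log_softmax(logits, dim=-1)
multinomial_log_lik = (clicks * log_probs).sum(dim=-1)   # one value per user
loss = -multinomial_log_lik.mean()                 # maximise likelihood = minimise this
```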

BAYESIAN INFERENCE · COLLABORATIVE FILTERING · LANGUAGE MODELLING

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

6 Jun 2015 mrahtz/learning-from-human-preferences

Bayesian models offer a mathematically grounded framework to reason about model uncertainty, but usually come with a prohibitive computational cost. Various network architectures and non-linearities are assessed on tasks of regression and classification, using MNIST as an example.
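
The resulting recipe, often called Monte Carlo dropout, keeps dropout active at prediction time and averages several stochastic forward passes to obtain a predictive mean and spread. The sketch below is an assumed illustration with an arbitrary architecture, not the listed repository's code.

```python
# Minimal sketch of Monte Carlo dropout at test time (illustrative only).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(64, 1))
model.train()                    # keep dropout stochastic even when predicting

x = torch.linspace(-3, 3, 50).unsqueeze(1)
with torch.no_grad():
    samples = torch.stack([model(x) for _ in range(100)])   # (100, 50, 1)

pred_mean = samples.mean(dim=0)
pred_std = samples.std(dim=0)    # larger values indicate more model uncertainty
```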

BAYESIAN INFERENCE