Search Results for author: Michalis Titsias (RC AUEB)

Found 5 papers, 0 papers with code

The Generalized Reparameterization Gradient

no code implementations · NeurIPS 2016 · Francisco R. Ruiz, Michalis Titsias (RC AUEB), David Blei

The reparameterization gradient has become a widely used method to obtain Monte Carlo gradients to optimize the variational objective.

Variational Inference
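
The paper above builds on the reparameterization trick, so a minimal point of reference may help: the NumPy sketch below estimates ELBO gradients for a one-dimensional Gaussian variational distribution via z = mu + sigma * eps. The target in grad_log_p and the sample size are illustrative assumptions; the generalized estimator that the paper proposes for distributions without a standard reparameterization is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_p(z):
    # Illustrative target: log p(z) = -0.5 * (z - 3)^2 (unnormalized),
    # so d(log p)/dz = 3 - z. Any differentiable log-density could be used.
    return 3.0 - z

def reparam_gradient(mu, log_sigma, n_samples=1000):
    """Monte Carlo estimate of the ELBO gradient w.r.t. (mu, log_sigma) for
    q(z) = N(mu, sigma^2), using the reparameterization z = mu + sigma * eps."""
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n_samples)
    z = mu + sigma * eps                     # pathwise (reparameterized) samples
    g = grad_log_p(z)                        # d(log p)/dz at the samples
    grad_mu = g.mean()                       # dz/dmu = 1
    # dz/dlog_sigma = sigma * eps; the Gaussian entropy adds +1 to this component
    grad_log_sigma = (g * sigma * eps).mean() + 1.0
    return grad_mu, grad_log_sigma

print(reparam_gradient(mu=0.0, log_sigma=0.0))
```

Plugging these estimates into plain gradient ascent on (mu, log_sigma) is the usual way the estimator is used to optimize the variational objective.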

One-vs-Each Approximation to Softmax for Scalable Estimation of Probabilities

no code implementations · NeurIPS 2016 · Michalis Titsias (RC AUEB)

The softmax representation of probabilities for categorical variables plays a prominent role in modern machine learning with numerous applications in areas such as large scale classification, neural language modeling and recommendation systems.

General Classification, Language Modelling, +2
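
As a concrete companion to the softmax discussion above, the sketch below compares the exact softmax probability with a one-vs-each style lower bound, i.e. a product of pairwise sigmoids over the competing classes. The class scores are arbitrary, and the stochastic estimation machinery that makes the approach scalable is not shown.

```python
import numpy as np

def softmax_prob(f, k):
    """Exact softmax probability of class k given the score vector f."""
    e = np.exp(f - f.max())                  # shift scores for numerical stability
    return e[k] / e.sum()

def one_vs_each_bound(f, k):
    """Product-of-sigmoids lower bound on the softmax probability of class k:
    prod_{j != k} sigmoid(f_k - f_j). Each factor involves only one competing
    class, which is what permits subsampling classes in large problems."""
    diffs = f[k] - np.delete(f, k)
    return np.prod(1.0 / (1.0 + np.exp(-diffs)))

f = np.array([2.0, 0.5, -1.0, 0.0])          # illustrative class scores
for k in range(len(f)):
    print(k, softmax_prob(f, k), one_vs_each_bound(f, k))
```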

Local Expectation Gradients for Black Box Variational Inference

no code implementations · NeurIPS 2015 · Michalis Titsias (RC AUEB), Miguel Lázaro-Gredilla

This algorithm divides the problem of estimating the stochastic gradients over multiple variational parameters into smaller sub-tasks, so that each sub-task intelligently explores the most relevant part of the variational distribution.

Variational Inference
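
To make the per-parameter sub-tasks concrete, here is a simplified sketch for a fully factorized Bernoulli variational posterior: the gradient with respect to each parameter theta_i is obtained by summing over the two values of its own latent variable exactly, while the remaining coordinates are handled by Monte Carlo. The two-variable target is an arbitrary stand-in, not anything from the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_p(z):
    # Illustrative unnormalized log-target over two binary variables:
    # it rewards z[0] == z[1]. Any discrete log-joint could be used instead.
    return 1.5 * (z[0] == z[1]) + 0.3 * z[0]

def local_expectation_grad(theta, n_samples=200):
    """ELBO gradient for q(z) = prod_i Bern(z_i; theta_i): for each theta_i,
    the expectation over z_i is computed exactly (both values), and only the
    remaining coordinates z_{-i} are sampled."""
    D = len(theta)
    grad = np.zeros(D)
    for _ in range(n_samples):
        z = (rng.random(D) < theta).astype(float)   # one draw of all coordinates
        for i in range(D):
            z1, z0 = z.copy(), z.copy()
            z1[i], z0[i] = 1.0, 0.0
            # d q(z_i)/d theta_i is +1 at z_i = 1 and -1 at z_i = 0
            grad[i] += (log_p(z1) - np.log(theta[i])) \
                     - (log_p(z0) - np.log(1.0 - theta[i]))
    return grad / n_samples

print(local_expectation_grad(np.array([0.5, 0.5])))
```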

Hamming Ball Auxiliary Sampling for Factorial Hidden Markov Models

no code implementations · NeurIPS 2014 · Michalis Titsias (RC AUEB), Christopher Yau

We introduce a novel sampling algorithm for Markov chain Monte Carlo-based Bayesian inference for factorial hidden Markov models.

Bayesian Inference
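
The auxiliary-variable idea can be illustrated without the factorial-HMM machinery. In the sketch below, a Hamming ball sampler targets a toy unnormalized distribution over a single binary vector: each iteration draws an auxiliary centre uniformly from the Hamming ball around the current state, then samples the new state exactly from the target restricted to the ball around that centre. The radius, dimensionality and target are arbitrary choices; the version in the paper combines this construction with forward-backward sampling over the hidden chains.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Illustrative unnormalized log-probability over binary vectors:
    # favours vectors with many ones, plus one pairwise interaction.
    return 1.2 * x.sum() + 0.8 * x[0] * x[-1]

def hamming_ball(centre, radius):
    """All binary vectors within Hamming distance `radius` of `centre`."""
    K = len(centre)
    ball = []
    for r in range(radius + 1):
        for idx in itertools.combinations(range(K), r):
            y = centre.copy()
            y[list(idx)] = 1 - y[list(idx)]
            ball.append(y)
    return ball

def hamming_ball_step(x, radius=1):
    """One auxiliary-variable move: u ~ Uniform(ball(x)), then x' drawn from the
    target restricted to ball(u), sampled exactly by enumerating the small ball."""
    ball_x = hamming_ball(x, radius)
    u = ball_x[rng.integers(len(ball_x))]        # auxiliary centre
    ball_u = hamming_ball(u, radius)
    logp = np.array([log_target(y) for y in ball_u])
    probs = np.exp(logp - logp.max())
    probs /= probs.sum()
    return ball_u[rng.choice(len(ball_u), p=probs)]

x = np.zeros(6, dtype=int)
for _ in range(1000):
    x = hamming_ball_step(x)
print(x)
```

Because the ball size does not depend on the current state, the move leaves the target distribution invariant while letting several variables change jointly in one step.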

Variational Inference for Mahalanobis Distance Metrics in Gaussian Process Regression

no code implementations · NeurIPS 2013 · Michalis Titsias (RC AUEB), Miguel Lázaro-Gredilla

We introduce a novel variational method that makes it possible to approximately integrate out kernel hyperparameters, such as length-scales, in Gaussian process regression.

Regression, Variational Inference
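
The "Mahalanobis distance metrics" in the title refer to a squared-exponential kernel whose lengthscale structure is a (possibly low-rank) linear map W, so that input distances are measured as ||W (x - x')||. The sketch below shows only this kernel parameterization and a draw from the corresponding GP prior; the variational machinery that integrates the hyperparameters out is not reproduced, and all shapes and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def mahalanobis_se_kernel(X1, X2, W, signal_var=1.0):
    """Squared-exponential kernel with a Mahalanobis metric:
    k(x, x') = signal_var * exp(-0.5 * ||W (x - x')||^2),
    where the rows of W act as learned (inverse) lengthscale directions."""
    Z1, Z2 = X1 @ W.T, X2 @ W.T                              # project the inputs
    sq = ((Z1[:, None, :] - Z2[None, :, :]) ** 2).sum(-1)    # pairwise sq. distances
    return signal_var * np.exp(-0.5 * sq)

# Illustrative setup: 5-D inputs projected to 2 latent dimensions.
X = rng.standard_normal((50, 5))
W = 0.5 * rng.standard_normal((2, 5))
K = mahalanobis_se_kernel(X, X, W)

# One draw from the zero-mean GP prior with this covariance (jitter for stability).
f = rng.multivariate_normal(np.zeros(len(X)), K + 1e-8 * np.eye(len(X)))
print(f[:5])
```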
