Search Results for author: Manfred Opper

Found 45 papers, 7 papers with code

A Convergence Analysis of Approximate Message Passing with Non-Separable Functions and Applications to Multi-Class Classification

no code implementations13 Feb 2024 Burak Çakmak, Yue M. Lu, Manfred Opper

Motivated by the recent application of approximate message passing (AMP) to the analysis of convex optimization in multi-class classification [Loureiro et al.], …

Multi-class Classification

Variational Inference for SDEs Driven by Fractional Noise

no code implementations19 Oct 2023 Rembert Daems, Manfred Opper, Guillaume Crevecoeur, Tolga Birdal

In this paper, building upon the Markov approximation of fBM, we derive the evidence lower bound essential for efficient variational inference of posterior path measures, drawing from the well-established field of stochastic analysis.

Variational Inference Video Prediction

Analysis of Random Sequential Message Passing Algorithms for Approximate Inference

no code implementations16 Feb 2022 Burak Çakmak, Yue M. Lu, Manfred Opper

We analyze the dynamics of a random sequential message passing algorithm for approximate inference with large Gaussian latent variable models in a student-teacher scenario.

Adaptive Inducing Points Selection For Gaussian Processes

no code implementations21 Jul 2021 Théo Galy-Fajou, Manfred Opper

Gaussian Processes (GPs) are flexible non-parametric models with a strong probabilistic interpretation.

Gaussian Processes Time Series +1

Nonlinear Hawkes Process with Gaussian Process Self Effects

no code implementations20 May 2021 Noa Malem-Shinitski, Cesar Ojeda, Manfred Opper

We continue the line of work of Bayesian inference for Hawkes processes, and our approach dispenses with the necessity of estimating a branching structure for the posterior, as we perform inference on an aggregated sum of Gaussian Processes.

Bayesian Inference Data Augmentation +3

Exact solution to the random sequential dynamics of a message passing algorithm

no code implementations5 Jan 2021 Burak Çakmak, Manfred Opper

We analyze the random sequential dynamics of a message passing algorithm for Ising models with random interactions in the large system limit.

Evidence Estimation by Kullback-Leibler Integration for Flow-Based Methods

no code implementations AABI Symposium 2021 Nikolai Zaki, Théo Galy-Fajou, Manfred Opper

Flow-based methods such as Stein Variational Gradient Descent have attracted considerable interest due to their flexibility and the strong theory behind them.
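Since this entry centers on Stein Variational Gradient Descent, a minimal SVGD sketch may help fix ideas; the standard-normal target, the RBF kernel with fixed bandwidth, and the step size are illustrative assumptions, not details from the paper:

```python
import numpy as np

def svgd_step(particles, grad_logp, h=1.0, step=0.1):
    """One Stein Variational Gradient Descent update with an RBF kernel."""
    n = particles.shape[0]
    diffs = particles[:, None, :] - particles[None, :, :]   # (n, n, d)
    sq = np.sum(diffs ** 2, axis=-1)                        # squared distances
    k = np.exp(-sq / (2 * h))                               # kernel matrix
    grads = grad_logp(particles)                            # (n, d)
    # phi_i = mean_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (k @ grads + np.sum(k[:, :, None] * diffs, axis=1) / h) / n
    return particles + step * phi

# Target: standard normal, so grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, size=(100, 2))    # particles start far from the target
for _ in range(500):
    x = svgd_step(x, lambda p: -p)
print(np.round(x.mean(axis=0), 1))        # particle mean drifts toward 0
```

The first term in the update pulls particles toward high-density regions; the kernel-gradient term repels them from one another, which is what keeps the particle set spread out rather than collapsing to the mode.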

A Dynamical Mean-Field Theory for Learning in Restricted Boltzmann Machines

no code implementations4 May 2020 Burak Çakmak, Manfred Opper

We define a message-passing algorithm for computing magnetizations in Restricted Boltzmann machines, which are Ising models on bipartite graphs introduced as neural network models for probability distributions over spin configurations.

A Mathematical Model of Local and Global Attention in Natural Scene Viewing

1 code implementation9 Apr 2020 Noa Malem-Shinitski, Manfred Opper, Sebastian Reich, Lisa Schwetlick, Stefan A. Seelig, Ralf Engbert

Thus, the main contribution of our modeling approach is two-fold: first, we propose a new model for saccade generation in scene viewing.

Neurons and Cognition

Understanding the dynamics of message passing algorithms: a free probability heuristics

no code implementations3 Feb 2020 Manfred Opper, Burak Çakmak

We use freeness assumptions of random matrix theory to analyze the dynamical behavior of inference algorithms for probabilistic models with dense coupling matrices in the limit of large systems.

Analysis of Bayesian Inference Algorithms by the Dynamical Functional Approach

no code implementations14 Jan 2020 Burak Çakmak, Manfred Opper

We analyze the dynamics of an algorithm for approximate inference with large Gaussian latent variable models in a student-teacher scenario.

Bayesian Inference

Multi-Class Gaussian Process Classification Made Conjugate: Efficient Inference via Data Augmentation

3 code implementations23 May 2019 Théo Galy-Fajou, Florian Wenzel, Christian Donner, Manfred Opper

We propose a new scalable multi-class Gaussian process classification approach building on a novel modified softmax likelihood function.

Bayesian Inference Data Augmentation +2

Memory-free dynamics for the TAP equations of Ising models with arbitrary rotation invariant ensembles of random coupling matrices

no code implementations24 Jan 2019 Burak Çakmak, Manfred Opper

We propose an iterative algorithm for solving the Thouless-Anderson-Palmer (TAP) equations of Ising models with arbitrary rotation invariant (random) coupling matrices.
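For orientation, the textbook TAP fixed-point iteration for an SK-type model (symmetric random couplings of order 1/sqrt(n)) looks as follows; the paper's contribution concerns general rotation-invariant ensembles and memory-free iterations, which this simple damped sketch does not reproduce:

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta = 200, 0.3
# SK-type couplings: symmetric, zero mean, entries of order 1/sqrt(n).
J = rng.normal(scale=1 / np.sqrt(n), size=(n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)
h = rng.normal(scale=0.1, size=n)        # weak external fields

m = np.zeros(n)                          # magnetizations
for _ in range(200):
    # TAP equations with the Onsager reaction term; damping aids convergence.
    onsager = beta ** 2 * ((J ** 2) @ (1 - m ** 2)) * m
    m = 0.5 * m + 0.5 * np.tanh(beta * (h + J @ m) - onsager)
print(float(np.max(np.abs(m))))          # magnetizations stay inside (-1, 1)
```

At high temperature (small beta) the damped iteration settles on a fixed point of the TAP equations; at lower temperatures naive iteration like this can fail to converge, which is precisely the regime that motivates more careful dynamics.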

Efficient Bayesian Inference of Sigmoidal Gaussian Cox Processes

1 code implementation Journal of Machine Learning Research 2018 Christian Donner, Manfred Opper

We present an approximate Bayesian inference approach for estimating the intensity of an inhomogeneous Poisson process, where the intensity function is modelled using a Gaussian process (GP) prior via a sigmoid link function.

Bayesian Inference
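The sigmoid link in this construction bounds the intensity above, which makes exact simulation by thinning straightforward; a sketch with a grid-sampled GP path and an RBF kernel (all parameter values illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
lam_max = 50.0                      # upper bound on the intensity

# Draw one GP path g on a fine grid (RBF kernel, length scale 0.1).
grid = np.linspace(0, 1, 500)
K = np.exp(-0.5 * (grid[:, None] - grid[None, :]) ** 2 / 0.1 ** 2)
g = rng.multivariate_normal(np.zeros(grid.size), K + 1e-6 * np.eye(grid.size))

def intensity(t):
    """Sigmoid link keeps the intensity inside (0, lam_max)."""
    gt = np.interp(t, grid, g)
    return lam_max / (1 + np.exp(-gt))

# Lewis-Shedler thinning: propose from a homogeneous Poisson(lam_max) process
# on [0, 1], keep each point with probability intensity(t) / lam_max.
n_prop = rng.poisson(lam_max)
proposals = rng.uniform(0, 1, size=n_prop)
keep = rng.uniform(0, lam_max, size=n_prop) < intensity(proposals)
events = np.sort(proposals[keep])
print(len(events))
```

The same boundedness of the sigmoid link is what the paper exploits for inference: the rejected points of the thinning construction can be treated as latent variables.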

Efficient Gaussian Process Classification Using Polya-Gamma Data Augmentation

3 code implementations18 Feb 2018 Florian Wenzel, Theo Galy-Fajou, Christian Donner, Marius Kloft, Manfred Opper

We propose a scalable stochastic variational approach to GP classification building on Polya-Gamma data augmentation and inducing points.

Classification Data Augmentation +1
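The augmentation rests on the Polya-Gamma identity sigmoid(z) = (1/2) exp(z/2) E[exp(-omega z^2 / 2)] with omega ~ PG(1, 0). A Monte Carlo check of the identity, sampling PG(1, 0) from its infinite sum-of-gammas representation (the truncation level and sample size below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_pg1(size, trunc=300):
    """PG(1, 0) via its sum-of-gammas representation, truncated at `trunc`."""
    k = np.arange(1, trunc + 1)
    g = rng.gamma(shape=1.0, scale=1.0, size=(size, trunc))   # Exp(1) draws
    return (g / (k - 0.5) ** 2).sum(axis=1) / (2 * np.pi ** 2)

# Identity: sigmoid(z) = 0.5 * exp(z/2) * E[exp(-omega * z^2 / 2)]
z = 1.3
omega = sample_pg1(50_000)
mc = 0.5 * np.exp(z / 2) * np.mean(np.exp(-omega * z ** 2 / 2))
exact = 1 / (1 + np.exp(-z))
print(round(mc, 3), round(exact, 3))   # the two should agree closely
```

Because the omega-conditional likelihood is Gaussian in the latent function, conditioning on omega restores conjugacy, which is what makes the variational updates in the paper closed-form.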

Expectation Propagation for Approximate Inference: Free Probability Framework

no code implementations16 Jan 2018 Burak Çakmak, Manfred Opper

We study asymptotic properties of expectation propagation (EP) -- a method for approximate inference originally developed in the field of machine learning.

Perturbative Black Box Variational Inference

no code implementations NeurIPS 2017 Robert Bamler, Cheng Zhang, Manfred Opper, Stephan Mandt

Black box variational inference (BBVI) with reparameterization gradients triggered the exploration of divergence measures other than the Kullback-Leibler (KL) divergence, such as alpha divergences.

Gaussian Processes Variational Inference
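As background for this entry, plain BBVI with reparameterization gradients (the standard KL-based ELBO; the paper's perturbative bounds are not reproduced here) fits a Gaussian to a 1-D target like so, with the target and hyperparameters chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

def grad_log_p(x):
    # Unnormalized target: log p(x) = -(x - 3)^2 / 2, i.e. N(3, 1).
    return -(x - 3.0)

mu, log_s = 0.0, 0.0
lr = 0.05
for _ in range(2000):
    eps = rng.normal(size=64)
    x = mu + np.exp(log_s) * eps          # reparameterization trick
    g = grad_log_p(x)
    # ELBO = E[log p(x)] + log_s + const; pathwise gradient estimates:
    grad_mu = g.mean()
    grad_log_s = (g * eps).mean() * np.exp(log_s) + 1.0
    mu += lr * grad_mu
    log_s += lr * grad_log_s
print(round(mu, 1), round(np.exp(log_s), 1))  # approaches the target's (3, 1)
```

Swapping the KL divergence in this objective for an alpha divergence, or correcting it perturbatively as the paper proposes, changes only the objective being estimated; the pathwise gradient machinery stays the same.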

Inverse Ising problem in continuous time: A latent variable approach

1 code implementation4 Sep 2017 Christian Donner, Manfred Opper

For data which are simulated from a more biologically plausible network with spiking neurons, we show that the Ising model captures well the low order statistics of the data and how the Ising couplings are related to the underlying synaptic structure of the simulated network.

Bayesian Inference

A statistical physics approach to learning curves for the Inverse Ising problem

no code implementations15 May 2017 Ludovica Bachschmid-Romano, Manfred Opper

Assuming that the data are generated from a true Ising model, we compute the reconstruction error of the couplings using a combination of the replica method with the cavity approach for densely connected systems.

Approximate Bayes learning of stochastic differential equations

no code implementations17 Feb 2017 Philipp Batz, Andreas Ruttor, Manfred Opper

We introduce a nonparametric approach for estimating drift and diffusion functions in systems of stochastic differential equations from observations of the state vector.

Gaussian Processes regression
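One common starting point for this kind of drift estimation (an Euler-discretization heuristic, not necessarily the paper's exact estimator) is to treat finite differences of the path as noisy observations of the drift and apply GP regression:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate an Ornstein-Uhlenbeck path: dX = -X dt + 0.5 dW (Euler-Maruyama).
dt, n = 0.1, 5000
x = np.empty(n)
x[0] = 1.0
for t in range(n - 1):
    x[t + 1] = x[t] - x[t] * dt + 0.5 * np.sqrt(dt) * rng.normal()

# Finite differences are noisy drift observations: (X_{t+dt}-X_t)/dt ~ f(X_t).
X, y = x[:-1], np.diff(x) / dt

# GP regression with an RBF kernel on a subsample (full O(n^3) would be slow).
idx = rng.choice(n - 1, size=300, replace=False)
Xs, ys = X[idx], y[idx]

def rbf(a, b, ell=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

noise = 0.5 ** 2 / dt                    # Euler noise variance sigma^2 / dt
K = rbf(Xs, Xs) + noise * np.eye(300)
alpha = np.linalg.solve(K, ys)

x_test = np.array([-0.5, 0.0, 0.5])
f_hat = rbf(x_test, Xs) @ alpha
print(np.round(f_hat, 1))                # signs should match the true drift -x
```

The key modeling choice is the observation noise: under the Euler scheme the finite differences have variance sigma^2 / dt around the drift, so the GP's noise level is dictated by the diffusion coefficient rather than tuned freely.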

Optimal Encoding and Decoding for Point Process Observations: an Approximate Closed-Form Filter

no code implementations12 Sep 2016 Yuval Harel, Ron Meir, Manfred Opper

The process of dynamic state estimation (filtering) based on point process observations is in general intractable.

Self-Averaging Expectation Propagation

no code implementations23 Aug 2016 Burak Çakmak, Manfred Opper, Bernard H. Fleury, Ole Winther

Our approach extends the framework of (generalized) approximate message passing -- which assumes zero-mean i.i.d. entries of the measurement matrix -- to a general class of random matrix ensembles.

Bayesian Inference

Variational perturbation and extended Plefka approaches to dynamics on random networks: the case of the kinetic Ising model

no code implementations28 Jul 2016 Ludovica Bachschmid-Romano, Claudia Battistin, Manfred Opper, Yasser Roudi

We first briefly consider the variational approach based on minimizing the Kullback-Leibler divergence between independent trajectories and the real ones, and note that this approach coincides with the mean-field equations from the saddle-point approximation to the generating functional only when the dynamics is defined through a logistic link function, as is the case for the kinetic Ising model with parallel update.

Expectation propagation for continuous time stochastic processes

no code implementations18 Dec 2015 Botond Cseke, David Schnoerr, Manfred Opper, Guido Sanguinetti

We consider the inverse problem of reconstructing the posterior measure over the trajectories of a diffusion process from discrete time observations and continuous time constraints.

A Tractable Approximation to Optimal Point Process Filtering: Application to Neural Encoding

no code implementations NeurIPS 2015 Yuval Harel, Ron Meir, Manfred Opper

The process of dynamic state estimation (filtering) based on point process observations is in general intractable.

An Analytically Tractable Bayesian Approximation to Optimal Point Process Filtering

no code implementations28 Jul 2015 Yuval Harel, Ron Meir, Manfred Opper

The process of dynamic state estimation (filtering) based on point process observations is in general intractable.

Optimal Neural Codes for Control and Estimation

no code implementations NeurIPS 2014 Alex K. Susemihl, Ron Meir, Manfred Opper

Within the framework of optimal Control Theory, one is usually given a cost function which is minimized by selecting a control law based on the observations.

Decision Making

Expectation Propagation

no code implementations22 Sep 2014 Jack Raymond, Andre Manoel, Manfred Opper

Variational inference is a powerful concept that underlies many iterative approximation algorithms; expectation propagation, mean-field methods and belief propagation were all central themes at the school, and all can be understood within this unifying framework.

Variational Inference

Optimal Population Codes for Control and Estimation

no code implementations27 Jun 2014 Alex Susemihl, Ron Meir, Manfred Opper

Within the framework of optimal Control Theory, one is usually given a cost function which is minimized by selecting a control law based on the observations.

Approximate Gaussian process inference for the drift function in stochastic differential equations

no code implementations NeurIPS 2013 Andreas Ruttor, Philipp Batz, Manfred Opper

We introduce a nonparametric approach for estimating drift functions in systems of stochastic differential equations from incomplete observations of the state vector.

regression

Approximate inference in latent Gaussian-Markov models from continuous time observations

no code implementations NeurIPS 2013 Botond Cseke, Manfred Opper, Guido Sanguinetti

We propose an approximate inference algorithm for continuous time Gaussian-Markov process models with both discrete and continuous time likelihoods.

Visualizing the Effects of a Changing Distance on Data Using Continuous Embeddings

1 code implementation8 Nov 2013 Gina Gruenhage, Manfred Opper, Simon Barthelme

The right scale is hard to pin down, and it is preferable when results do not depend too tightly on the exact value one picks.

Clustering Dimensionality Reduction +1

Temporal Autoencoding Improves Generative Models of Time Series

no code implementations12 Sep 2013 Chris Häusler, Alex Susemihl, Martin P Nawrot, Manfred Opper

This leads to a significant improvement in the performance of both models in a filling-in-frames task across a number of datasets.

Denoising Time Series +2

Perturbative Corrections for Approximate Inference in Gaussian Latent Variable Models

no code implementations12 Jan 2013 Manfred Opper, Ulrich Paquet, Ole Winther

A perturbative expansion is made of the exact but intractable correction, which can be applied to the model's partition function and other moments of interest.

Inference in continuous-time change-point models

no code implementations NeurIPS 2011 Florian Stimberg, Manfred Opper, Guido Sanguinetti, Andreas Ruttor

We consider the problem of Bayesian inference for continuous time multi-stable stochastic systems which can change both their diffusion and drift parameters at discrete times.

Bayesian Inference

Approximate inference in continuous time Gaussian-Jump processes

no code implementations NeurIPS 2010 Manfred Opper, Andreas Ruttor, Guido Sanguinetti

We present a novel approach to inference in conditionally Gaussian continuous time stochastic processes, where the latent process is a Markovian jump process.

Gaussian Processes

Improving on Expectation Propagation

no code implementations NeurIPS 2008 Manfred Opper, Ulrich Paquet, Ole Winther

We develop a series of corrections to Expectation Propagation (EP), which is one of the most popular methods for approximate probabilistic inference.

Variational Inference for Diffusion Processes

no code implementations NeurIPS 2007 Cédric Archambeau, Manfred Opper, Yuan Shen, Dan Cornford, John S. Shawe-Taylor

Diffusion processes are a family of continuous-time continuous-state stochastic processes that are in general only partially observed.

Variational Inference
