no code implementations • 13 Feb 2024 • Burak Çakmak, Yue M. Lu, Manfred Opper
Motivated by the recent application of approximate message passing (AMP) to the analysis of convex optimization in multi-class classification [Loureiro et al.]
no code implementations • 19 Oct 2023 • Rembert Daems, Manfred Opper, Guillaume Crevecoeur, Tolga Birdal
In this paper, building upon the Markov approximation of fBM, we derive the evidence lower bound essential for efficient variational inference of posterior path measures, drawing from the well-established field of stochastic analysis.
no code implementations • 16 Feb 2022 • Burak Çakmak, Yue M. Lu, Manfred Opper
We analyze the dynamics of a random sequential message passing algorithm for approximate inference with large Gaussian latent variable models in a student-teacher scenario.
no code implementations • 21 Jul 2021 • Théo Galy-Fajou, Manfred Opper
Gaussian Processes (GPs) are flexible non-parametric models with a strong probabilistic interpretation.
no code implementations • 20 May 2021 • Noa Malem-Shinitski, Cesar Ojeda, Manfred Opper
We continue the line of work of Bayesian inference for Hawkes processes, and our approach dispenses with the necessity of estimating a branching structure for the posterior, as we perform inference on an aggregated sum of Gaussian Processes.
no code implementations • 5 Jan 2021 • Burak Çakmak, Manfred Opper
We analyze the random sequential dynamics of a message passing algorithm for Ising models with random interactions in the large system limit.
no code implementations • AABI Symposium 2021 • Théo Galy-Fajou, Valerio Perrone, Manfred Opper
Bayesian inference is intractable for most practical problems and requires approximation schemes with several trade-offs.
no code implementations • AABI Symposium 2021 • Nikolai Zaki, Théo Galy-Fajou, Manfred Opper
Flow-based methods such as Stein Variational Gradient Descent have attracted considerable interest due to their flexibility and the strong theory behind them.
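As background, the Stein Variational Gradient Descent update mentioned here has a compact closed form: each particle moves along a kernel-weighted average of the target's score plus a repulsive term. A minimal sketch with an RBF kernel (the function name and fixed bandwidth are illustrative, not from the paper):

```python
import numpy as np

def svgd_step(x, grad_logp, bandwidth=1.0):
    """One SVGD update direction for particles x of shape (n, d).

    grad_logp(x) returns the (n, d) gradients of the log target
    density at the particles; an RBF kernel is used.
    """
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]            # diff[i, j] = x_i - x_j
    sq = np.sum(diff ** 2, axis=-1)
    k = np.exp(-sq / (2 * bandwidth ** 2))          # kernel matrix k[i, j]
    # attractive term: kernel-weighted average of the score
    drive = k @ grad_logp(x)
    # repulsive term: sum_j grad_{x_j} k(x_j, x_i) = sum_j (x_i - x_j)/h^2 * k
    repulsion = (diff * k[..., None]).sum(axis=1) / bandwidth ** 2
    return (drive + repulsion) / n

# one small gradient step toward the target N(0, I)
particles = np.array([[2.0], [-1.5], [0.5]])
particles = particles + 0.1 * svgd_step(particles, lambda x: -x)
```

With a single particle the repulsion vanishes and the update reduces to plain gradient ascent on the log density.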
no code implementations • 4 May 2020 • Burak Çakmak, Manfred Opper
We define a message-passing algorithm for computing magnetizations in Restricted Boltzmann machines, which are Ising models on bipartite graphs introduced as neural network models for probability distributions over spin configurations.
1 code implementation • 9 Apr 2020 • Noa Malem-Shinitski, Manfred Opper, Sebastian Reich, Lisa Schwetlick, Stefan A. Seelig, Ralf Engbert
Thus, the main contribution of our modeling approach is two-fold: first, we propose a new model for saccade generation in scene viewing.
Neurons and Cognition
1 code implementation • 26 Feb 2020 • Théo Galy-Fajou, Florian Wenzel, Manfred Opper
Building on the conjugate structure of the augmented model, we develop two inference methods.
no code implementations • 3 Feb 2020 • Manfred Opper, Burak Çakmak
We use freeness assumptions of random matrix theory to analyze the dynamical behavior of inference algorithms for probabilistic models with dense coupling matrices in the limit of large systems.
no code implementations • 14 Jan 2020 • Burak Çakmak, Manfred Opper
We analyze the dynamics of an algorithm for approximate inference with large Gaussian latent variable models in a student-teacher scenario.
no code implementations • 30 Sep 2019 • Robert Bamler, Cheng Zhang, Manfred Opper, Stephan Mandt
In this paper, we revisit perturbation theory as a powerful way of improving the variational approximation.
3 code implementations • 23 May 2019 • Théo Galy-Fajou, Florian Wenzel, Christian Donner, Manfred Opper
We propose a new scalable multi-class Gaussian process classification approach building on a novel modified softmax likelihood function.
no code implementations • 24 Jan 2019 • Burak Çakmak, Manfred Opper
We propose an iterative algorithm for solving the Thouless-Anderson-Palmer (TAP) equations of Ising models with arbitrary rotation invariant (random) coupling matrices.
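For orientation, the classical TAP equations being solved take the form m_i = tanh(β(h_i + Σ_j J_ij m_j) − β²(1 − q) m_i). A minimal sketch of a naively damped fixed-point iteration for SK-type couplings (this is the textbook scheme, not the authors' convergent algorithm for general rotation-invariant ensembles; the function name is illustrative):

```python
import numpy as np

def tap_magnetizations(J, h, beta=0.5, damping=0.5, iters=300):
    """Damped fixed-point iteration of the TAP equations for an
    SK-type Ising model (symmetric J, zero diagonal, variance ~1/N):
        m_i = tanh(beta*(h_i + sum_j J_ij m_j) - beta^2 (1 - q) m_i),
    where q = (1/N) sum_j m_j^2 gives the Onsager reaction term.
    """
    m = np.zeros(len(h))
    for _ in range(iters):
        q = np.mean(m ** 2)
        field = beta * (h + J @ m) - beta ** 2 * (1 - q) * m
        m = (1 - damping) * m + damping * np.tanh(field)
    return m
```

At high temperature (small β) this iteration converges; the paper's point is precisely that such naive iterations can fail for general coupling ensembles, motivating a provably convergent alternative.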
1 code implementation • Journal of Machine Learning Research 2018 • Christian Donner, Manfred Opper
We present an approximate Bayesian inference approach for estimating the intensity of an inhomogeneous Poisson process, where the intensity function is modelled using a Gaussian process (GP) prior via a sigmoid link function.
no code implementations • 29 May 2018 • Christian Donner, Manfred Opper
We reconsider a nonparametric density model based on Gaussian processes.
3 code implementations • 18 Feb 2018 • Florian Wenzel, Théo Galy-Fajou, Christian Donner, Marius Kloft, Manfred Opper
We propose a scalable stochastic variational approach to GP classification building on Polya-Gamma data augmentation and inducing points.
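The Pólya-Gamma augmentation behind this approach yields conditionally conjugate updates. A minimal non-sparse sketch of the CAVI core, assuming a logistic likelihood and labels in {−1, +1} (the paper's scalable version additionally uses inducing points and stochastic optimization; `pg_gp_classify` is an illustrative name):

```python
import numpy as np

def pg_gp_classify(K, y, iters=50):
    """CAVI updates for GP classification with a logistic likelihood
    via Polya-Gamma (PG) augmentation.

    K: (n, n) GP prior covariance; y: labels in {-1, +1}.
    q(f) = N(mu, Sigma); q(omega_i) = PG(1, c_i) with
    E[omega_i] = tanh(c_i / 2) / (2 c_i).
    """
    n = len(y)
    c = np.ones(n)                      # PG tilt parameters
    K_inv = np.linalg.inv(K)
    for _ in range(iters):
        e_omega = np.tanh(c / 2) / (2 * c)
        Sigma = np.linalg.inv(K_inv + np.diag(e_omega))
        mu = Sigma @ (y / 2)
        c = np.sqrt(np.diag(Sigma) + mu ** 2)   # c_i^2 = E[f_i^2]
    return mu, Sigma
```

Each update is in closed form because, given the PG variables, the logistic likelihood becomes Gaussian in f; this is what makes the approach amenable to fast natural-gradient or coordinate-ascent optimization.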
no code implementations • 16 Jan 2018 • Burak Çakmak, Manfred Opper
We study asymptotic properties of expectation propagation (EP) -- a method for approximate inference originally developed in the field of machine learning.
no code implementations • NeurIPS 2017 • Robert Bamler, Cheng Zhang, Manfred Opper, Stephan Mandt
Black box variational inference (BBVI) with reparameterization gradients triggered the exploration of divergence measures other than the Kullback-Leibler (KL) divergence, such as alpha divergences.
1 code implementation • 4 Sep 2017 • Christian Donner, Manfred Opper
For data which are simulated from a more biologically plausible network with spiking neurons, we show that the Ising model captures well the low order statistics of the data and how the Ising couplings are related to the underlying synaptic structure of the simulated network.
no code implementations • 15 May 2017 • Ludovica Bachschmid-Romano, Manfred Opper
Assuming that the data are generated from a true Ising model, we compute the reconstruction error of the couplings using a combination of the replica method with the cavity approach for densely connected systems.
no code implementations • 17 Feb 2017 • Philipp Batz, Andreas Ruttor, Manfred Opper
We introduce a nonparametric approach for estimating drift and diffusion functions in systems of stochastic differential equations from observations of the state vector.
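In the densely observed regime, the idea can be reduced to a simple baseline: regress the scaled increments (X_{t+Δt} − X_t)/Δt on X_t with a GP, treating the diffusion as regression noise of variance σ²/Δt. A one-dimensional sketch under these assumptions (function name and hyperparameters are illustrative, not the paper's estimator, which also handles the diffusion function and sparse observations):

```python
import numpy as np

def gp_drift_estimate(path, dt, x_query, length_scale=0.5, sigma2=0.25):
    """Nonparametric drift estimate for a 1-d SDE dX = f(X) dt + sigma dW:
    GP regression of the scaled increments (X_{t+dt} - X_t) / dt on X_t.
    The increment noise has variance sigma^2 / dt, which enters as the
    regression noise level.
    """
    x = path[:-1]
    targets = np.diff(path) / dt
    def k(a, b):
        return np.exp(-(a[:, None] - b[None, :]) ** 2
                      / (2 * length_scale ** 2))
    K = k(x, x) + (sigma2 / dt) * np.eye(len(x))
    alpha = np.linalg.solve(K, targets)
    return k(np.atleast_1d(x_query), x) @ alpha
```

On a simulated Ornstein-Uhlenbeck path (f(x) = −x), the posterior mean recovers the negative slope of the true drift near the bulk of the data.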
no code implementations • 12 Sep 2016 • Yuval Harel, Ron Meir, Manfred Opper
The process of dynamic state estimation (filtering) based on point process observations is in general intractable.
no code implementations • 23 Aug 2016 • Burak Çakmak, Manfred Opper, Bernard H. Fleury, Ole Winther
Our approach extends the framework of (generalized) approximate message passing -- which assumes zero-mean i.i.d. entries of the measurement matrix -- to a general class of random matrix ensembles.
no code implementations • 28 Jul 2016 • Ludovica Bachschmid-Romano, Claudia Battistin, Manfred Opper, Yasser Roudi
We first briefly consider the variational approach based on minimizing the Kullback-Leibler divergence between independent trajectories and the real ones. We note that this approach coincides with the mean-field equations from the saddle-point approximation to the generating functional only when the dynamics is defined through a logistic link function, which is the case for the kinetic Ising model with parallel update.
no code implementations • 18 Dec 2015 • Botond Cseke, David Schnoerr, Manfred Opper, Guido Sanguinetti
We consider the inverse problem of reconstructing the posterior measure over the trajectories of a diffusion process from discrete time observations and continuous time constraints.
no code implementations • NeurIPS 2015 • Yuval Harel, Ron Meir, Manfred Opper
The process of dynamic state estimation (filtering) based on point process observations is in general intractable.
no code implementations • 28 Jul 2015 • Yuval Harel, Ron Meir, Manfred Opper
The process of dynamic state estimation (filtering) based on point process observations is in general intractable.
no code implementations • NeurIPS 2014 • Alex K. Susemihl, Ron Meir, Manfred Opper
Within the framework of optimal control theory, one is usually given a cost function that is minimized by selecting a control law based on the observations.
no code implementations • NeurIPS 2014 • Florian Stimberg, Andreas Ruttor, Manfred Opper
We introduce a model where the rate of an inhomogeneous Poisson process is modified by a Chinese restaurant process.
no code implementations • 22 Sep 2014 • Jack Raymond, Andre Manoel, Manfred Opper
Variational inference is a powerful concept that underlies many iterative approximation algorithms; expectation propagation, mean-field methods and belief propagation were all central themes at the school and can all be viewed within this unifying framework.
no code implementations • 27 Jun 2014 • Alex Susemihl, Ron Meir, Manfred Opper
Within the framework of optimal control theory, one is usually given a cost function that is minimized by selecting a control law based on the observations.
no code implementations • NeurIPS 2013 • Andreas Ruttor, Philipp Batz, Manfred Opper
We introduce a nonparametric approach for estimating drift functions in systems of stochastic differential equations from incomplete observations of the state vector.
no code implementations • NeurIPS 2013 • Botond Cseke, Manfred Opper, Guido Sanguinetti
We propose an approximate inference algorithm for continuous time Gaussian-Markov process models with both discrete and continuous time likelihoods.
1 code implementation • 8 Nov 2013 • Gina Gruenhage, Manfred Opper, Simon Barthelme
The right scale is hard to pin down, and it is preferable when results do not depend too tightly on the exact value one picks.
no code implementations • 12 Sep 2013 • Chris Häusler, Alex Susemihl, Martin P Nawrot, Manfred Opper
This leads to a significant improvement in the performance of both models in a filling-in-frames task across a number of datasets.
no code implementations • 12 Jan 2013 • Manfred Opper, Ulrich Paquet, Ole Winther
A perturbative expansion is made of the exact but intractable correction, and can be applied to the model's partition function and other moments of interest.
no code implementations • NeurIPS 2011 • Florian Stimberg, Manfred Opper, Guido Sanguinetti, Andreas Ruttor
We consider the problem of Bayesian inference for continuous time multi-stable stochastic systems which can change both their diffusion and drift parameters at discrete times.
no code implementations • NeurIPS 2011 • Alex K. Susemihl, Ron Meir, Manfred Opper
Bayesian filtering of stochastic stimuli has received a great deal of attention recently.
no code implementations • NeurIPS 2010 • Manfred Opper, Andreas Ruttor, Guido Sanguinetti
We present a novel approach to inference in conditionally Gaussian continuous time stochastic processes, where the latent process is a Markovian jump process.
no code implementations • NeurIPS 2008 • Manfred Opper, Ulrich Paquet, Ole Winther
We develop a series of corrections to Expectation Propagation (EP), which is one of the most popular methods for approximate probabilistic inference.
no code implementations • NeurIPS 2007 • Cédric Archambeau, Manfred Opper, Yuan Shen, Dan Cornford, John S. Shawe-Taylor
Diffusion processes are a family of continuous-time continuous-state stochastic processes that are in general only partially observed.
no code implementations • NeurIPS 2007 • Manfred Opper, Guido Sanguinetti
Markov jump processes play an important role in a large number of application domains.