no code implementations • 19 Jan 2022 • Camille Gontier, Simone Carlo Surace, Igor Delvendahl, Martin Müller, Jean-Pascal Pfister
Bayesian Active Learning (BAL) is an efficient framework for learning the parameters of a model, in which input stimuli are selected to maximize the mutual information between the observations and the unknown parameters.
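The stimulus-selection step described above can be sketched for a toy setting; the Bernoulli response model, the discrete parameter grid, and all numbers below are illustrative assumptions, not the paper's model:

```python
import numpy as np

# Toy Bayesian active learning: pick the stimulus x that maximizes the
# mutual information I(y; theta) between the (binary) observation y and
# the unknown parameter theta, under the current posterior over theta.
thetas = np.linspace(0.1, 0.9, 9)                     # candidate parameter values
posterior = np.full(len(thetas), 1.0 / len(thetas))   # uniform prior
stimuli = np.linspace(0.0, 1.0, 21)                   # candidate input stimuli

def p_y1(x, theta):
    # Assumed sigmoidal observation model p(y = 1 | x, theta).
    return 1.0 / (1.0 + np.exp(-10.0 * (x - theta)))

def binary_entropy(p):
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return -p * np.log(p) - (1.0 - p) * np.log(1.0 - p)

def mutual_information(x, posterior):
    py1_given_theta = p_y1(x, thetas)
    py1 = np.sum(posterior * py1_given_theta)          # predictive p(y = 1)
    # I(y; theta) = H(y) - E_theta[ H(y | theta) ]
    return binary_entropy(py1) - np.sum(posterior * binary_entropy(py1_given_theta))

best_x = max(stimuli, key=lambda x: mutual_information(x, posterior))
```

After observing the outcome at `best_x`, the posterior would be updated by Bayes' rule and the selection repeated.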
1 code implementation • NeurIPS 2021 • Christian Horvat, Jean-Pascal Pfister
Here we propose a novel method - called Denoising Normalizing Flow (DNF) - that estimates the density on the low-dimensional manifold while learning the manifold as well.
2 code implementations • 25 May 2021 • Christian Horvat, Jean-Pascal Pfister
We also show that, if the embedding dimension is much larger than the manifold dimension, noise in the normal space can be well approximated by Gaussian noise.
1 code implementation • 21 Mar 2019 • Anna Kutschireiter, Simone Carlo Surace, Jean-Pascal Pfister
From there we continue our journey through discrete-time models, which are usually encountered in machine learning, and then generalize to continuous-time filtering theory, on which we place particular emphasis.
Methodology
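The discrete-time Bayesian filtering recursion mentioned above can be illustrated by its linear-Gaussian special case, the Kalman filter (predict with the dynamics, correct with the observation); the model parameters below are illustrative, not taken from the paper:

```python
import numpy as np

# Minimal scalar Kalman filter: the closed-form Bayesian filter for a
# linear-Gaussian state-space model. Parameter values are assumptions.
A, C = 0.9, 1.0      # state transition and observation gains
Q, R = 0.1, 0.5      # process and observation noise variances

def kalman_step(m, P, y):
    # Predict: propagate the posterior mean and variance through the dynamics.
    m_pred = A * m
    P_pred = A * P * A + Q
    # Correct: weight the prediction error by the Kalman gain.
    K = P_pred * C / (C * P_pred * C + R)
    m_new = m_pred + K * (y - C * m_pred)
    P_new = (1.0 - K * C) * P_pred
    return m_new, P_new

rng = np.random.default_rng(1)
x, m, P = 0.0, 0.0, 1.0              # true state, posterior mean, posterior variance
for _ in range(200):
    x = A * x + rng.normal(scale=np.sqrt(Q))   # simulate the hidden state
    y = C * x + rng.normal(scale=np.sqrt(R))   # simulate the observation
    m, P = kalman_step(m, P, y)
```

The posterior variance `P` converges to a steady state below the prior variance, reflecting the information gained from the observations.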
no code implementations • 18 Sep 2018 • Matthieu Gilson, Jean-Pascal Pfister
The present paper provides exact mathematical expressions for the high-order moments of spiking activity in a recurrently connected network of linear Hawkes processes.
no code implementations • 1 Nov 2016 • Simone Carlo Surace, Jean-Pascal Pfister
We revisit the problem of estimating the parameters of a partially observed diffusion process, consisting of a hidden state process and an observed process, with a continuous time parameter.
no code implementations • 4 Oct 2014 • Laurence Aitchison, Jannes Jegminat, Jorge Aurelio Menendez, Jean-Pascal Pfister, Alex Pouget, Peter E. Latham
Synapses then use that uncertainty to adjust their learning rates, with more uncertain weights having higher learning rates.
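A minimal sketch of such an uncertainty-modulated learning rule, assuming a single scalar weight observed under Gaussian noise (the setting and all numbers are illustrative, not the paper's model):

```python
import numpy as np

# Each weight keeps a mean and a variance; the effective learning rate is
# the variance's share of the total uncertainty, so more uncertain weights
# update faster and uncertainty shrinks with each observation.
rng = np.random.default_rng(2)
w_true = 0.7                  # hypothetical target weight
mean, var = 0.0, 1.0          # prior belief over the weight
obs_var = 0.25                # assumed observation noise variance

for _ in range(100):
    y = w_true + rng.normal(scale=np.sqrt(obs_var))  # noisy evidence about the weight
    lr = var / (var + obs_var)       # learning rate grows with uncertainty
    mean += lr * (y - mean)          # prediction-error update of the mean
    var = (1.0 - lr) * var           # uncertainty shrinks after each update
```

This is the scalar Kalman/delta-rule correspondence: early on (high variance) updates are large, and they shrink as the estimate becomes confident.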
no code implementations • NeurIPS 2011 • Johanni Brea, Walter Senn, Jean-Pascal Pfister
We consider a statistical framework in which recurrent networks of spiking neurons learn to generate spatio-temporal spike patterns.
no code implementations • NeurIPS 2009 • Jean-Pascal Pfister, Peter Dayan, Máté Lengyel
Synapses exhibit an extraordinary degree of short-term malleability, with release probabilities and effective synaptic strengths changing markedly over multiple timescales.