no code implementations • 4 Feb 2025 • Grigory Bartosh, Dmitry Vetrov, Christian A. Naesseth
The Latent Stochastic Differential Equation (SDE) is a powerful tool for time series and sequence modeling.
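For orientation, a minimal sketch of what a latent SDE model involves: simulate a latent path with Euler-Maruyama and decode it into an observed sequence. The drift, diffusion, and decoder below are illustrative stand-ins, not the paper's model.

```python
# Hypothetical sketch: simulate a latent SDE dz = f(z, t) dt + g(t) dW
# with Euler-Maruyama, then decode latents into an observed sequence.
import numpy as np

def euler_maruyama(f, g, z0, t_grid, rng):
    """Simulate one latent path on the time grid t_grid."""
    z = np.array(z0, dtype=float)
    path = [z.copy()]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        dt = t1 - t0
        dw = rng.normal(scale=np.sqrt(dt), size=z.shape)  # Brownian increment
        z = z + f(z, t0) * dt + g(t0) * dw
        path.append(z.copy())
    return np.stack(path)

rng = np.random.default_rng(0)
t_grid = np.linspace(0.0, 1.0, 101)
# Illustrative Ornstein-Uhlenbeck drift and constant diffusion.
latent = euler_maruyama(lambda z, t: -2.0 * z, lambda t: 0.5,
                        z0=[1.0], t_grid=t_grid, rng=rng)
# A toy linear "decoder" from latent state to noisy observations.
obs = 3.0 * latent + rng.normal(scale=0.1, size=latent.shape)
```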
1 code implementation • 29 Jan 2025 • Alex Chen, Philipe Chlenski, Kenneth Munyuza, Antonio Khalil Moretti, Christian A. Naesseth, Itsik Pe'er
Hyperbolic space naturally encodes hierarchical structures such as phylogenies (binary trees), where inward-bending geodesics reflect paths through least common ancestors, and the exponential growth of neighborhoods mirrors the super-exponential scaling of topologies.
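The geometric claim is easy to verify numerically. Below is an illustrative computation of distances in the Poincaré ball model of hyperbolic space (the example points are arbitrary): two near-boundary "leaves" are far apart, and the shortest path between them bends inward through the neighborhood of a common "ancestor".

```python
# Illustrative only: distances in the Poincare ball model of hyperbolic
# space, where distances blow up near the boundary -- the property that
# makes hyperbolic embeddings natural for trees/phylogenies.
import numpy as np

def poincare_distance(u, v):
    u, v = np.asarray(u, float), np.asarray(v, float)
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq / denom)

root = np.array([0.0, 0.0])
leaf_a = np.array([0.95, 0.0])            # points near the boundary ...
leaf_b = np.array([0.0, 0.95])
print(poincare_distance(root, leaf_a))    # ~3.7: moderate
print(poincare_distance(leaf_a, leaf_b))  # ~6.6: the leaves are far apart;
# the geodesic between them bends inward toward the origin.
```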
no code implementations • 5 Jun 2024 • Hanming Yang, Antonio Khalil Moretti, Sebastian Macaluso, Philippe Chlenski, Christian A. Naesseth, Itsik Pe'er
Reconstructing jets, which provide vital insights into the properties and histories of subatomic particles produced in high-energy collisions, is a central problem in collider-physics data analysis.
1 code implementation • 31 May 2024 • Metod Jazbec, Alexander Timans, Tin Hadži Veljković, Kaspar Sakmann, Dan Zhang, Christian A. Naesseth, Eric Nalisnick
Scaling machine learning models significantly improves their performance.
1 code implementation • 19 Apr 2024 • Grigory Bartosh, Dmitry Vetrov, Christian A. Naesseth
Conventional diffusion models typically rely on a fixed forward process, which implicitly defines complex marginal distributions over latent variables.
Ranked #1 on Image Generation on ImageNet 32x32 (bpd metric)
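For context, a hedged sketch of the fixed forward process the abstract contrasts with: in a DDPM-style diffusion model, the marginal q(x_t | x_0) is a fixed Gaussian that is never learned. The schedule values below are illustrative.

```python
# The "fixed forward process" being contrasted: a standard variance-
# preserving Gaussian noising with a fixed, hand-chosen schedule.
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)          # fixed noise schedule
alpha_bar = np.cumprod(1.0 - betas)         # \bar{alpha}_t

def q_sample(x0, t, rng):
    """Draw x_t ~ q(x_t | x_0) = N(sqrt(abar_t) x0, (1 - abar_t) I)."""
    eps = rng.normal(size=x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

rng = np.random.default_rng(0)
x0 = rng.normal(size=(8,))
x_mid = q_sample(x0, t=T // 2, rng=rng)     # partially noised sample
```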
no code implementations • 14 Mar 2024 • Heiko Zimmermann, Christian A. Naesseth, Jan-Willem van de Meent
We present variational inference with sequential sample-average approximation (VISA), a method for approximate inference in computationally intensive models, such as those based on numerical simulations.
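A hedged sketch of the sample-average-approximation idea behind VISA, assuming the core move is freezing a batch of base noise so the stochastic ELBO becomes a deterministic objective reusable across optimizer steps; the toy target, the refresh-free loop, and the finite-difference gradients are all illustrative, not the paper's procedure.

```python
# Sample-average approximation for VI, sketched: fix the base noise once,
# then optimize the resulting *deterministic* ELBO of a 1-D Gaussian q
# against an unnormalized target log p.
import numpy as np

def log_p(z):               # unnormalized target: N(2, 0.5^2)
    return -0.5 * ((z - 2.0) / 0.5) ** 2

rng = np.random.default_rng(0)
eps = rng.normal(size=64)   # FIXED noise: the sample-average approximation
mu, log_sig = 0.0, 0.0

def elbo(mu, log_sig):
    z = mu + np.exp(log_sig) * eps            # reparameterized samples
    entropy = log_sig + 0.5 * np.log(2 * np.pi * np.e)
    return np.mean(log_p(z)) + entropy

for step in range(500):                       # finite-difference ascent on
    h = 1e-4                                  # the deterministic objective
    g_mu = (elbo(mu + h, log_sig) - elbo(mu - h, log_sig)) / (2 * h)
    g_ls = (elbo(mu, log_sig + h) - elbo(mu, log_sig - h)) / (2 * h)
    mu, log_sig = mu + 0.05 * g_mu, log_sig + 0.05 * g_ls
print(mu, np.exp(log_sig))   # approaches (2.0, 0.5)
```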
no code implementations • 17 Nov 2023 • Alexander Timans, Christoph-Nikolas Straehle, Kaspar Sakmann, Christian A. Naesseth, Eric Nalisnick
Multiple hypothesis testing (MHT) frequently arises in scientific inquiry; testing many hypotheses concurrently inflates the risk of Type-I errors (false positives), making MHT corrections essential.
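The inflation is easy to quantify, as in this minimal illustration with a Bonferroni correction (the p-values are made up):

```python
# With m independent tests at level alpha, the chance of at least one
# false positive is 1 - (1 - alpha)^m; Bonferroni controls the family-wise
# error rate by testing each hypothesis at alpha / m instead.
m, alpha = 20, 0.05
print(1 - (1 - alpha) ** m)        # ~0.64 family-wise error, uncorrected
p_values = [0.001, 0.04, 0.2]      # illustrative p-values
rejected = [p <= alpha / m for p in p_values]   # Bonferroni correction
print(rejected)                    # only p = 0.001 survives
```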
no code implementations • 12 Oct 2023 • Grigory Bartosh, Dmitry Vetrov, Christian A. Naesseth
In this paper, we present Neural Diffusion Models (NDMs), a generalization of conventional diffusion models that enables defining and learning time-dependent non-linear transformations of data.
Ranked #6 on Image Generation on ImageNet 64x64 (bits-per-dim metric)
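A hedged reading of that generalization in code, assuming the forward marginal is built from a learnable, time-dependent transformation F_theta(x, t) of the data (with F the identity, the conventional fixed process is recovered); the toy F and schedule values below are stand-ins, not the paper's parameterization.

```python
# Sketch of the NDM idea as stated in the abstract: a learnable,
# time-dependent transformation inside the forward marginal.
import numpy as np

def F_theta(x, t):
    # Illustrative stand-in "network": shrinks the data as t grows.
    # In an NDM this would be a learned neural transformation.
    return x / (1.0 + t)

def ndm_marginal_sample(x, t, alpha_t, sigma_t, rng):
    """z_t ~ N(alpha_t * F_theta(x, t), sigma_t^2 I); with F = identity
    this recovers a conventional fixed forward process."""
    return alpha_t * F_theta(x, t) + sigma_t * rng.normal(size=x.shape)

rng = np.random.default_rng(0)
x = rng.normal(size=(4,))
z = ndm_marginal_sample(x, t=0.5, alpha_t=0.8, sigma_t=0.6, rng=rng)
```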
2 code implementations • NeurIPS 2023 • Luhuan Wu, Brian L. Trippe, Christian A. Naesseth, David M. Blei, John P. Cunningham
Diffusion models have been successful on a range of conditional generation tasks including molecular design and text-to-image generation.
no code implementations • 24 Oct 2022 • Teodora Pandeva, Tim Bakker, Christian A. Naesseth, Patrick Forré
We introduce a powerful deep classifier two-sample test for high-dimensional data based on E-values, called E-value Classifier Two-Sample Test (E-C2ST).
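A hedged sketch of how a classifier yields an E-value, using a generic "testing by betting" construction that is not necessarily E-C2ST's exact procedure: under H0 (P = Q), a held-out point's label is a fair coin independent of x, so each factor c(x)[y] / 0.5 has expectation 1, the product over held-out points is a valid E-value, and E >= 20 gives a level-0.05 test by Markov's inequality.

```python
# Generic classifier-based E-value for a two-sample test (illustrative,
# not the exact E-C2ST construction).
import numpy as np

rng = np.random.default_rng(0)
n = 500
x_p = rng.normal(0.0, 1.0, size=n)            # sample from P
x_q = rng.normal(0.7, 1.0, size=n)            # sample from Q (shifted)

# Fit a simple (LDA-style) probabilistic classifier on the first half.
m0, m1 = x_p[:n // 2].mean(), x_q[:n // 2].mean()
s2 = np.concatenate([x_p[:n // 2] - m0, x_q[:n // 2] - m1]).var()

def prob_q(x):                                 # calibrated P(label = Q | x)
    return 1.0 / (1.0 + np.exp(-(m1 - m0) / s2 * (x - 0.5 * (m0 + m1))))

# Accumulate the E-value on the held-out half.
x_test = np.concatenate([x_p[n // 2:], x_q[n // 2:]])
y_test = np.concatenate([np.zeros(n // 2), np.ones(n // 2)])
p1 = prob_q(x_test)
log_e = np.sum(np.log(np.where(y_test == 1, p1, 1.0 - p1) / 0.5))
print(log_e > np.log(20.0))                    # True: reject H0 at level 0.05
```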
no code implementations • 14 Oct 2022 • Heiko Zimmermann, Fredrik Lindsten, Jan-Willem van de Meent, Christian A. Naesseth
Generative flow networks (GFNs) are a class of models for sequential sampling of composite objects, which approximate a target distribution that is defined in terms of an energy function or a reward.
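For concreteness, here is the trajectory-balance objective commonly used to train GFNs, on a toy generator that builds a length-3 bit string; this is a generic GFN sketch, not the variational construction studied in the paper.

```python
# Trajectory-balance loss for one sampled GFN trajectory; minimizing it
# over trajectories drives the sampler toward p(x) proportional to R(x).
import numpy as np

def trajectory_balance_loss(log_Z, log_pf_steps, log_pb_steps, reward):
    """TB loss: (log Z + sum log P_F - log R(x) - sum log P_B)^2."""
    return (log_Z + sum(log_pf_steps)
            - np.log(reward) - sum(log_pb_steps)) ** 2

# One trajectory: the forward policy picks each of 3 bits with prob 0.5;
# the backward policy is deterministic (each state has a single parent).
log_pf = [np.log(0.5)] * 3
log_pb = [np.log(1.0)] * 3
loss = trajectory_balance_loss(log_Z=0.0, log_pf_steps=log_pf,
                               log_pb_steps=log_pb, reward=2.0)
```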
1 code implementation • 3 Feb 2022 • Liyi Zhang, David M. Blei, Christian A. Naesseth
Variational inference often minimizes the "reverse" Kullback-Leibler (KL) divergence KL(q||p) from the approximate distribution q to the posterior p. Recent work studies the "forward" KL, KL(p||q), which unlike the reverse KL does not lead to variational approximations that underestimate uncertainty.
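The mode-seeking vs. mass-covering contrast can be checked numerically by fitting a single Gaussian q to a bimodal target p under each divergence (a crude grid search; all values illustrative): reverse KL locks onto one mode and underestimates spread, while forward KL matches moments and covers both.

```python
# Fit a Gaussian q to a bimodal target p by reverse KL(q||p) and by
# forward KL(p||q), on a grid, to see the underestimation of uncertainty.
import numpy as np

z = np.linspace(-8, 8, 4001)
dz = z[1] - z[0]
p = 0.5 * np.exp(-0.5 * (z + 2) ** 2) + 0.5 * np.exp(-0.5 * (z - 2) ** 2)
p /= p.sum() * dz                                # bimodal target density

def gaussian(mu, sig):
    q = np.exp(-0.5 * ((z - mu) / sig) ** 2)
    return q / (q.sum() * dz)

def kl(a, b):                                    # KL(a || b) on the grid
    mask = a > 1e-12
    return np.sum(a[mask] * np.log(a[mask] / np.maximum(b[mask], 1e-300))) * dz

grid = [(mu, sig) for mu in np.linspace(-3, 3, 61)
                  for sig in np.linspace(0.3, 4, 38)]
rev = min(grid, key=lambda th: kl(gaussian(*th), p))   # reverse KL(q||p)
fwd = min(grid, key=lambda th: kl(p, gaussian(*th)))   # forward KL(p||q)
print("reverse KL fit:", rev)   # sits on one mode, sigma ~1: too narrow
print("forward KL fit:", fwd)   # centered near 0, sigma ~2.2: covers both
```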
1 code implementation • 31 May 2021 • Antonio Khalil Moretti, Liyi Zhang, Christian A. Naesseth, Hadiah Venner, David Blei, Itsik Pe'er
Bayesian phylogenetic inference is often conducted via local or sequential search over topologies and branch lengths using algorithms such as random-walk Markov chain Monte Carlo (MCMC) or Combinatorial Sequential Monte Carlo (CSMC).
no code implementations • NeurIPS 2020 • Christian A. Naesseth, Fredrik Lindsten, David Blei
Modern variational inference (VI) uses stochastic gradients to avoid intractable expectations, enabling large-scale probabilistic inference in complex models.
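The simplest instance of such a stochastic gradient is the score-function (REINFORCE) estimator, sketched here for a Gaussian variational family; the integrand is a toy choice.

```python
# Score-function estimator of grad_mu E_{q_mu}[f(z)] for q_mu = N(mu, 1),
# using grad_mu log q_mu(z) = (z - mu).
import numpy as np

rng = np.random.default_rng(0)
f = lambda z: z ** 2                 # toy integrand; E[f] = mu^2 + 1
mu = 1.5
z = rng.normal(mu, 1.0, size=100_000)
grad_est = np.mean(f(z) * (z - mu))  # score-function estimator
print(grad_est)                      # ~= d/dmu (mu^2 + 1) = 2 mu = 3.0
```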
no code implementations • 12 Mar 2019 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
A core problem in statistics and probabilistic machine learning is to compute probability distributions and expectations.
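The generic Monte Carlo answer to that problem, as a baseline for the SMC methods the paper surveys: self-normalized importance sampling estimates E_p[f] from proposal samples when p is known only up to a constant. The target and proposal below are toys.

```python
# Self-normalized importance sampling: weight proposal samples by
# p_tilde / q, normalize, and average.
import numpy as np

rng = np.random.default_rng(0)
log_p_tilde = lambda z: -0.5 * (z - 1.0) ** 2      # unnormalized target N(1,1)
z = rng.normal(0.0, 2.0, size=100_000)             # proposal q = N(0, 4)
log_q = -0.5 * (z / 2.0) ** 2 - np.log(2.0)        # constants cancel below
log_w = log_p_tilde(z) - log_q
w = np.exp(log_w - log_w.max())                    # stabilize, then normalize
w /= w.sum()
print(np.sum(w * z))                               # ~= E_p[z] = 1.0
```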
1 code implementation • 31 May 2017 • Christian A. Naesseth, Scott W. Linderman, Rajesh Ranganath, David M. Blei
The success of variational approaches depends on (i) formulating a flexible parametric family of distributions, and (ii) optimizing the parameters to find the member of this family that most closely approximates the exact posterior.
no code implementations • 29 Dec 2016 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
Sequential Monte Carlo (SMC) methods comprise one of the most successful approaches to approximate Bayesian filtering.
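A minimal bootstrap particle filter for a 1-D linear-Gaussian state space model, the canonical instance of SMC-based approximate Bayesian filtering; model parameters and particle count are illustrative.

```python
# Bootstrap particle filter: propagate particles through the transition,
# weight by the observation likelihood, resample.
import numpy as np

rng = np.random.default_rng(0)
T, N = 50, 1000                         # time steps, particles
a, q, r = 0.9, 0.5, 1.0                 # transition coef, state/obs noise std

# Simulate data from x_t = a x_{t-1} + q eps_t, y_t = x_t + r nu_t.
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + q * rng.normal()
    y[t] = x_true[t] + r * rng.normal()

particles = rng.normal(size=N)
means = []
for t in range(1, T):
    particles = a * particles + q * rng.normal(size=N)   # propagate (prior)
    logw = -0.5 * ((y[t] - particles) / r) ** 2          # weight by likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    means.append(np.sum(w * particles))                  # filtering mean
    idx = rng.choice(N, size=N, p=w)                     # multinomial resample
    particles = particles[idx]

print(means[-1], x_true[-1])            # the filter tracks the latent state
```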
2 code implementations • 18 Oct 2016 • Christian A. Naesseth, Francisco J. R. Ruiz, Scott W. Linderman, David M. Blei
Variational inference using the reparameterization trick has enabled large-scale approximate Bayesian inference in complex probabilistic models, leveraging stochastic optimization to sidestep intractable expectations.
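The trick itself in its basic Gaussian form, for reference (the paper extends it to distributions simulated by acceptance-rejection, e.g. the gamma, which this sketch does not cover): write z = mu + sigma * eps so gradients flow through the sampling step.

```python
# Reparameterized (pathwise) estimator of grad_mu E[f(z)] with f(z) = z^2
# and z = mu + sigma * eps, eps ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 1.0
eps = rng.normal(size=100_000)
z = mu + sigma * eps                 # pathwise / reparameterized sample
grad_est = np.mean(2.0 * z)          # df/dz * dz/dmu, with dz/dmu = 1
print(grad_est)                      # ~= 2 mu = 3.0; typically lower
                                     # variance than the score-function
                                     # estimator for the same f
```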
1 code implementation • 16 Feb 2016 • Tom Rainforth, Christian A. Naesseth, Fredrik Lindsten, Brooks Paige, Jan-Willem van de Meent, Arnaud Doucet, Frank Wood
We introduce interacting particle Markov chain Monte Carlo (iPMCMC), a PMCMC method based on an interacting pool of standard and conditional sequential Monte Carlo samplers.
no code implementations • 20 Mar 2015 • Thomas B. Schön, Fredrik Lindsten, Johan Dahlin, Johan Wågberg, Christian A. Naesseth, Andreas Svensson, Liang Dai
One of the key challenges in identifying nonlinear and possibly non-Gaussian state space models (SSMs) is the intractability of estimating the system state.
1 code implementation • 9 Feb 2015 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
Nested Sequential Monte Carlo (NSMC) generalises the SMC framework by requiring only approximate, properly weighted samples from the SMC proposal distribution, while still resulting in a correct SMC algorithm.
3 code implementations • 19 Jun 2014 • Fredrik Lindsten, Adam M. Johansen, Christian A. Naesseth, Bonnie Kirkpatrick, Thomas B. Schön, John Aston, Alexandre Bouchard-Côté
We propose a novel class of Sequential Monte Carlo (SMC) algorithms, appropriate for inference in probabilistic graphical models.
no code implementations • NeurIPS 2014 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
We propose a new framework for using sequential Monte Carlo (SMC) algorithms for inference in probabilistic graphical models (PGMs).