Search Results for author: Jan Stanczuk

Found 8 papers, 2 papers with code

Time Series Diffusion in the Frequency Domain

1 code implementation • 8 Feb 2024 • Jonathan Crabbé, Nicolas Huynh, Jan Stanczuk, Mihaela van der Schaar

We explain this observation by showing that time series from these datasets tend to be more localized in the frequency domain than in the time domain, which makes them easier to model in the former case.
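The notion of frequency-domain localization can be checked directly (an illustrative sketch, not the paper's code): a signal that oscillates across every time step may still concentrate almost all of its energy in a handful of Fourier coefficients.

```python
import numpy as np

# A pure sinusoid: spread out over all 256 time steps, but
# concentrated in a single frequency bin.
t = np.arange(256)
x = np.sin(2 * np.pi * 8 * t / 256)

def energy_top_k(coeffs, k=4):
    """Fraction of total energy captured by the k largest-magnitude coefficients."""
    e = np.sort(np.abs(coeffs) ** 2)[::-1]
    return e[:k].sum() / e.sum()

time_conc = energy_top_k(x)               # low: energy spread over all samples
freq_conc = energy_top_k(np.fft.rfft(x))  # high: energy in one bin
print(time_conc < freq_conc)  # prints True
```

For such signals, modelling the Fourier coefficients means modelling a distribution supported on far fewer effective dimensions.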

Denoising • Inductive Bias +1

Closing the ODE-SDE gap in score-based diffusion models through the Fokker-Planck equation

no code implementations • 27 Nov 2023 • Teo Deveney, Jan Stanczuk, Lisa Maria Kreusser, Chris Budd, Carola-Bibiane Schönlieb

In this paper we rigorously describe the range of dynamics and approximations that arise when training score-based diffusion models, including the true SDE dynamics, the neural approximations, the various approximate particle dynamics that result, as well as their associated Fokker--Planck equations and the neural network approximations of these Fokker--Planck equations.

Variational Diffusion Auto-encoder: Latent Space Extraction from Pre-trained Diffusion Models

no code implementations • 24 Apr 2023 • Georgios Batzolis, Jan Stanczuk, Carola-Bibiane Schönlieb

This issue stems from the unrealistic assumption that approximates the conditional data distribution, $p(\textbf{x} | \textbf{z})$, as an isotropic Gaussian.
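The consequence of that assumption can be made concrete (a hedged sketch with illustrative names, not the paper's code): under $p(\textbf{x} | \textbf{z}) = \mathcal{N}(\textbf{x}; \mu(\textbf{z}), \sigma^2 I)$, the log-likelihood collapses to a scaled squared error plus a constant, which is why decoders trained this way effectively optimise MSE.

```python
import numpy as np

def isotropic_gaussian_log_likelihood(x, mu, sigma=1.0):
    """log N(x; mu, sigma^2 I): a constant minus a scaled squared error."""
    d = x.size
    return (-0.5 * d * np.log(2 * np.pi * sigma**2)
            - 0.5 * np.sum((x - mu) ** 2) / sigma**2)

x = np.array([1.0, 2.0, 3.0])    # data point
mu = np.array([1.1, 1.9, 3.2])   # decoder mean for some latent z
ll = isotropic_gaussian_log_likelihood(x, mu)
```

Maximising this likelihood over `mu` is exactly minimising the squared reconstruction error, independent of any structure in the true conditional distribution.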

Your diffusion model secretly knows the dimension of the data manifold

no code implementations • 23 Dec 2022 • Jan Stanczuk, Georgios Batzolis, Teo Deveney, Carola-Bibiane Schönlieb

A diffusion model approximates the score function, i.e. the gradient of the log density of a noise-corrupted version of the target distribution, for varying levels of corruption.
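In one dimension the object being approximated has a closed form, which makes the definition easy to verify numerically (an illustrative sketch, not the paper's code): corrupting a Gaussian target $\mathcal{N}(\mu, s_0^2)$ with $\mathcal{N}(0, \sigma^2)$ noise yields $\mathcal{N}(\mu, s_0^2 + \sigma^2)$, whose score is analytic.

```python
from math import log, pi

mu, s0, sigma = 0.5, 1.0, 0.3
var = s0**2 + sigma**2  # variance of the noise-corrupted distribution

def log_density(x):
    """log p_sigma(x) for the corrupted Gaussian N(mu, var)."""
    return -0.5 * log(2 * pi * var) - 0.5 * (x - mu) ** 2 / var

def score(x):
    """d/dx log p_sigma(x), the quantity a diffusion model learns."""
    return -(x - mu) / var

# Sanity check: the analytic score matches a central finite
# difference of the log density.
x, h = 1.7, 1e-5
fd = (log_density(x + h) - log_density(x - h)) / (2 * h)
```

The paper's observation is that, near the data, the learned score points back onto the data manifold, so its behaviour across noise levels reveals the manifold's dimension.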

Non-Uniform Diffusion Models

no code implementations • 20 Jul 2022 • Georgios Batzolis, Jan Stanczuk, Carola-Bibiane Schönlieb, Christian Etmann

We show that non-uniform diffusion leads to multi-scale diffusion models whose structure is similar to that of multi-scale normalizing flows.

Denoising

Conditional Image Generation with Score-Based Diffusion Models

1 code implementation • 26 Nov 2021 • Georgios Batzolis, Jan Stanczuk, Carola-Bibiane Schönlieb, Christian Etmann

Score-based diffusion models have emerged as one of the most promising frameworks for deep generative modelling.

Conditional Image Generation

Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)

no code implementations • 2 Mar 2021 • Jan Stanczuk, Christian Etmann, Lisa Maria Kreusser, Carola-Bibiane Schönlieb

Wasserstein GANs are based on the idea of minimising the Wasserstein distance between a real and a generated distribution.
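The quantity in question is easy to state in one dimension (an illustrative sketch, not the paper's code): for two empirical distributions with equally many samples, the Wasserstein-1 distance reduces to the mean absolute difference of the sorted samples.

```python
import numpy as np

def wasserstein_1d(a, b):
    """W1 between two equal-size empirical distributions in 1-D:
    sorting gives the optimal coupling, so the distance is the
    mean absolute difference of order statistics."""
    return np.mean(np.abs(np.sort(a) - np.sort(b)))

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=10_000)
fake = rng.normal(0.5, 1.0, size=10_000)  # "generator" shifted by 0.5
w = wasserstein_1d(real, fake)  # close to the true shift of 0.5
```

The paper's point is that the critic in a WGAN does not actually compute this distance well, and that the training signal it provides is useful for other reasons.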
