no code implementations • 31 Jan 2023 • Jonathan Wider, Jakob Kruse, Nils Weitzel, Janica C. Bühler, Ullrich Köthe, Kira Rehfeld
Simulating abundances of stable water isotopologues, i.e., molecules differing in their isotopic composition, within climate models allows for comparisons with proxy data and, thus, for testing hypotheses about past climate and validating climate models under varying climatic conditions.
1 code implementation • 23 Nov 2022 • Lukas Schumacher, Paul-Christian Bürkner, Andreas Voss, Ullrich Köthe, Stefan T. Radev
In its simplest form, such a model entails a hierarchy between a low-level observation model and a high-level transition model.
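As a rough illustration of this two-level structure (not the paper's model; all distributional choices and variable names here are arbitrary), a minimal simulator might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# High-level transition model: the latent parameters theta_t drift slowly over time.
def transition(theta_prev, drift_scale=0.1):
    return theta_prev + rng.normal(0.0, drift_scale, size=theta_prev.shape)

# Low-level observation model: observations within step t are generated from theta_t.
def observe(theta_t, n_obs=50):
    loc, scale = theta_t
    return rng.normal(loc, abs(scale) + 1e-6, size=n_obs)

theta = np.array([0.0, 1.0])        # e.g. (mean, scale) of the observation model
data = []
for t in range(20):
    theta = transition(theta)       # high level: parameters change between steps
    data.append(observe(theta))     # low level: data are emitted within each step
```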
no code implementations • 25 Oct 2022 • Felix Draxler, Christoph Schnörr, Ullrich Köthe
For the first time, we make a quantitative statement about this kind of convergence: We prove that all coupling-based normalizing flows perform whitening of the data distribution (i.e., diagonalize the covariance matrix) and derive corresponding convergence bounds that show a linear convergence rate in the depth of the flow.
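As an illustrative diagnostic (not part of the paper's proof), one way to measure how far the flow's output is from having a diagonal covariance is:

```python
import numpy as np

def off_diagonal_mass(z):
    """Frobenius norm of the off-diagonal entries of the empirical covariance
    of z (shape: n_samples x n_dims). It is zero exactly when the covariance
    is diagonal, i.e. when the samples are whitened up to per-axis scaling."""
    cov = np.cov(z, rowvar=False)
    return np.linalg.norm(cov - np.diag(np.diag(cov)))

# Usage idea: push data through the first k coupling blocks of a trained flow
# for increasing k and track how this quantity decays with depth.
```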
no code implementations • 30 Aug 2022 • Robert Schmier, Ullrich Köthe, Christoph-Nikolas Straehle
We use a self-supervised feature extractor trained on the auxiliary dataset and train a normalizing flow on the extracted features by maximizing the likelihood on in-distribution data and minimizing the likelihood on the auxiliary dataset.
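A minimal sketch of such a contrastive likelihood objective, assuming a flow object that exposes a per-sample log_prob method (the names and the aux_weight knob are hypothetical):

```python
import torch

def contrastive_flow_loss(flow, feats_in: torch.Tensor, feats_aux: torch.Tensor,
                          aux_weight: float = 1.0) -> torch.Tensor:
    """Sketch of the training objective: raise the model likelihood on
    in-distribution features and lower it on auxiliary (outlier) features.
    `flow` is assumed to expose log_prob(features) -> per-sample log density."""
    nll_in = -flow.log_prob(feats_in).mean()      # maximize likelihood in-distribution
    ll_aux = flow.log_prob(feats_aux).mean()      # minimize likelihood on auxiliary data
    return nll_in + aux_weight * ll_aux
```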
1 code implementation • 29 Jul 2022 • Malte Tölle, Ullrich Köthe, Florian André, Benjamin Meder, Sandy Engelhardt
Manipulation of the latent space leads to a modified image while preserving important details.
no code implementations • CVPR 2022 • Titus Leistner, Radek Mackowiak, Lynton Ardizzone, Ullrich Köthe, Carsten Rother
We argue that this is due to current methods considering only a single "true" depth, even when multiple objects at different depths contributed to the color of a single pixel.
2 code implementations • 16 Dec 2021 • Marvin Schmitt, Paul-Christian Bürkner, Ullrich Köthe, Stefan T. Radev
Neural density estimators have proven remarkably powerful in performing efficient simulation-based Bayesian inference in various research domains.
1 code implementation • 5 May 2021 • Lynton Ardizzone, Jakob Kruse, Carsten Lüth, Niels Bracher, Carsten Rother, Ullrich Köthe
We introduce a new architecture called a conditional invertible neural network (cINN), and use it to address the task of diverse image-to-image translation for natural images.
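For illustration, a generic conditional affine coupling block in PyTorch might look as follows; this is a sketch of the general cINN building block, not the exact architecture or conditioning network used in the paper:

```python
import torch
import torch.nn as nn

class ConditionalAffineCoupling(nn.Module):
    """Generic conditional coupling block (sketch): the second half of the
    input is scaled and shifted by a network that sees the first half and a
    conditioning vector c, e.g. an embedding of the conditioning image."""

    def __init__(self, dim: int, cond_dim: int, hidden: int = 128):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x, c):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(torch.cat([x1, c], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)                        # soft clamp for numerical stability
        z2 = x2 * torch.exp(s) + t               # invertible given x1 and c
        return torch.cat([x1, z2], dim=1), s.sum(dim=1)   # output and log|det J|

    def inverse(self, z, c):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s, t = self.net(torch.cat([z1, c], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)
        return torch.cat([z1, (z2 - t) * torch.exp(-s)], dim=1)
```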
no code implementations • 26 Jan 2021 • Jakob Kruse, Lynton Ardizzone, Carsten Rother, Ullrich Köthe
Recent work demonstrated that flow-based invertible neural networks are promising tools for solving ambiguous inverse problems.
no code implementations • 10 Nov 2020 • Jan-Hinrich Nölke, Tim Adler, Janek Gröhl, Thomas Kirchner, Lynton Ardizzone, Carsten Rother, Ullrich Köthe, Lena Maier-Hein
Multispectral photoacoustic imaging (PAI) is an emerging imaging modality which enables the recovery of functional tissue parameters such as blood oxygenation.
no code implementations • 14 Oct 2020 • Jens Müller, Robert Schmier, Lynton Ardizzone, Carsten Rother, Ullrich Köthe
Standard supervised learning breaks down under data distribution shift.
1 code implementation • 1 Oct 2020 • Stefan T. Radev, Frederik Graw, Simiao Chen, Nico T. Mutters, Vanessa M. Eichel, Till Bärnighausen, Ullrich Köthe
Mathematical models in epidemiology are an indispensable tool to determine the dynamics and important characteristics of infectious diseases.
2 code implementations • CVPR 2021 • Radek Mackowiak, Lynton Ardizzone, Ullrich Köthe, Carsten Rother
Generative classifiers (GCs) are a promising class of models that are said to naturally accomplish these qualities.
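The basic decision rule of a generative classifier can be sketched as follows, assuming per-class log densities log p(x | c) are available, e.g. from class-conditional flows (names and shapes are illustrative):

```python
import numpy as np

def generative_classify(log_px_given_c: np.ndarray, log_prior: np.ndarray) -> np.ndarray:
    """Bayes-rule decision of a generative classifier: pick the class whose
    class-conditional density, weighted by its prior, best explains x.
    log_px_given_c: (n_samples, n_classes) array of log p(x | c)
    log_prior:      (n_classes,) array of log p(c)."""
    log_joint = log_px_given_c + log_prior    # log p(x, c), proportional to p(c | x)
    return np.argmax(log_joint, axis=1)
```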
1 code implementation • 22 Apr 2020 • Stefan T. Radev, Marco D'Alessandro, Ulf K. Mertens, Andreas Voss, Ullrich Köthe, Paul-Christian Bürkner
This makes the method particularly effective in scenarios where model fit needs to be assessed for a large number of datasets, such that per-dataset inference is practically infeasible. Finally, we propose a novel way to measure epistemic uncertainty in model comparison problems.
1 code implementation • 13 Mar 2020 • Stefan T. Radev, Ulf K. Mertens, Andreas Voss, Lynton Ardizzone, Ullrich Köthe
In addition, our method incorporates a summary network trained to embed the observed data into maximally informative summary statistics.
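A minimal sketch of such a permutation-invariant summary network in PyTorch (DeepSet-style mean pooling; dimensions and names are illustrative, not the paper's exact architecture):

```python
import torch
import torch.nn as nn

class SummaryNet(nn.Module):
    """Permutation-invariant summary network (sketch): each observation is
    embedded individually, and the embeddings are mean-pooled into a fixed-size
    summary vector that the inference network then conditions on."""

    def __init__(self, obs_dim: int, summary_dim: int = 32, hidden: int = 64):
        super().__init__()
        self.embed = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.head = nn.Linear(hidden, summary_dim)

    def forward(self, x):                  # x: (batch, n_observations, obs_dim)
        return self.head(self.embed(x).mean(dim=1))
```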
3 code implementations • NeurIPS 2020 • Lynton Ardizzone, Radek Mackowiak, Carsten Rother, Ullrich Köthe
In this work, firstly, we develop the theory and methodology of IB-INNs, a class of conditional normalizing flows where INNs are trained using the IB objective: Introducing a small amount of controlled information loss allows for an asymptotically exact formulation of the IB, while keeping the INN's generative capabilities intact.
1 code implementation • ICLR 2020 • Peter Sorrenson, Carsten Rother, Ullrich Köthe
Furthermore, the recovered informative latent variables will be in one-to-one correspondence with the true latent variables of the generating process, up to a trivial component-wise transformation.
no code implementations • 5 Nov 2019 • Tim J. Adler, Leonardo Ayala, Lynton Ardizzone, Hannes G. Kenngott, Anant Vemuri, Beat P. Müller-Stich, Carsten Rother, Ullrich Köthe, Lena Maier-Hein
Multispectral optical imaging is becoming a key tool in the operating room.
no code implementations • 25 Sep 2019 • Lynton Ardizzone, Carsten Lüth, Jakob Kruse, Carsten Rother, Ullrich Köthe
In this work, we address the task of natural image generation guided by a conditioning input.
no code implementations • 23 Sep 2019 • Ricard Durall, Franz-Josef Pfreundt, Ullrich Köthe, Janis Keuper
Recent deep-learning-based approaches have shown remarkable success on object segmentation tasks.
5 code implementations • 4 Jul 2019 • Lynton Ardizzone, Carsten Lüth, Jakob Kruse, Carsten Rother, Ullrich Köthe
We demonstrate these properties for the tasks of MNIST digit generation and image colorization.
1 code implementation • 25 May 2019 • Jakob Kruse, Gianluca Detommaso, Ullrich Köthe, Robert Scheichl
Many recent invertible neural architectures are based on coupling block designs, where the variables are divided into two subsets that serve as inputs to an easily invertible (usually affine) triangular transformation.
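In the standard affine case (notation illustrative), one such block and its triangular Jacobian read:

```latex
% One affine coupling block: split x = (x_1, x_2) and transform only x_2,
% using networks s and t that never need to be inverted.
\begin{aligned}
y_1 &= x_1, \qquad y_2 = x_2 \odot \exp\!\big(s(x_1)\big) + t(x_1), \\
x_2 &= \big(y_2 - t(y_1)\big) \odot \exp\!\big(-s(y_1)\big)
       \quad \text{(analytic inverse)}, \\
\frac{\partial y}{\partial x} &=
  \begin{pmatrix} I & 0 \\ \ast & \operatorname{diag}\!\big(e^{s(x_1)}\big) \end{pmatrix}
  \quad \text{(triangular Jacobian, cheap log-determinant)}.
\end{aligned}
```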
no code implementations • 25 Apr 2019 • Steffen Wolf, Alberto Bailoni, Constantin Pape, Nasim Rahaman, Anna Kreshuk, Ullrich Köthe, Fred A. Hamprecht
Unlike seeded watershed, the algorithm can accommodate not only attractive but also repulsive cues, allowing it to find a previously unspecified number of segments without the need for explicit seeds or a tunable threshold.
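A simplified sketch of this idea in Python, using union-find with mutual-exclusion constraints (a toy re-implementation for illustration, not the paper's optimized algorithm):

```python
class MutexWatershedSketch:
    """Attractive edges merge clusters, repulsive edges forbid future merges;
    all edges are processed once in order of decreasing |weight|.
    No seeds and no threshold are required."""

    def __init__(self, n_nodes):
        self.parent = list(range(n_nodes))
        self.mutex = [set() for _ in range(n_nodes)]   # forbidden partners, per root

    def find(self, u):
        while self.parent[u] != u:
            self.parent[u] = self.parent[self.parent[u]]   # path halving
            u = self.parent[u]
        return u

    def cluster(self, edges):
        # edges: iterable of (weight, u, v, is_attractive)
        for _, u, v, attractive in sorted(edges, key=lambda e: -abs(e[0])):
            ru, rv = self.find(u), self.find(v)
            if ru == rv:
                continue
            if attractive:
                if rv not in self.mutex[ru]:               # merge unless forbidden
                    self.parent[rv] = ru
                    self.mutex[ru] |= self.mutex[rv]
                    for m in self.mutex[rv]:               # re-point constraints to new root
                        self.mutex[m].discard(rv)
                        self.mutex[m].add(ru)
            else:
                self.mutex[ru].add(rv)                     # forbid this merge (symmetric)
                self.mutex[rv].add(ru)
        return [self.find(u) for u in range(len(self.parent))]
```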
no code implementations • 8 Mar 2019 • Tim J. Adler, Lynton Ardizzone, Anant Vemuri, Leonardo Ayala, Janek Gröhl, Thomas Kirchner, Sebastian Wirkert, Jakob Kruse, Carsten Rother, Ullrich Köthe, Lena Maier-Hein
Assessment of the specific hardware used in conjunction with such algorithms, however, has not properly addressed the possibility that the problem may be ill-posed.
2 code implementations • ICLR 2019 • Lynton Ardizzone, Jakob Kruse, Sebastian Wirkert, Daniel Rahner, Eric W. Pellegrini, Ralf S. Klessen, Lena Maier-Hein, Carsten Rother, Ullrich Köthe
Often, the forward process from parameter- to measurement-space is a well-defined function, whereas the inverse problem is ambiguous: one measurement may map to multiple different sets of parameters.
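A toy example of this ambiguity (not from the paper): the forward map y = x² is a well-defined function, yet a single measurement admits two distinct parameter values:

```python
import numpy as np

def forward(x):
    return x ** 2          # well-defined forward process

candidates = np.array([-2.0, 2.0])
# Both parameter values explain the same measurement y = 4, so the inverse
# problem has no unique answer; a posterior over x captures both solutions.
assert np.allclose(forward(candidates), 4.0)
```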
no code implementations • ICCV 2017 • Steffen Wolf, Lukas Schott, Ullrich Köthe, Fred Hamprecht
Learned boundary maps are known to outperform hand-crafted ones as a basis for the watershed algorithm.
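For context, a minimal example of running the watershed transform on a boundary map with scikit-image; the boundary map here is random only to keep the snippet self-contained, whereas in practice it would be predicted by a network or produced by a hand-crafted edge detector:

```python
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Boundary map with high values on (putative) object boundaries.
boundary_map = np.random.rand(64, 64)

# Seeds at local minima of the boundary map (= local maxima of its negation).
seed_coords = peak_local_max(-boundary_map, min_distance=5)
markers = np.zeros(boundary_map.shape, dtype=int)
markers[tuple(seed_coords.T)] = np.arange(1, len(seed_coords) + 1)

segmentation = watershed(boundary_map, markers)   # label image of catchment basins
```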