Search Results for author: Ullrich Köthe

Found 18 papers, 10 papers with code

Conditional Invertible Neural Networks for Diverse Image-to-Image Translation

1 code implementation • 5 May 2021 • Lynton Ardizzone, Jakob Kruse, Carsten Lüth, Niels Bracher, Carsten Rother, Ullrich Köthe

We introduce a new architecture called a conditional invertible neural network (cINN), and use it to address the task of diverse image-to-image translation for natural images.
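The conditioning mechanism behind such a cINN can be sketched in a few lines: a coupling transform remains exactly invertible in x for every fixed condition c, because c only enters the internal subnet and is never itself inverted. A minimal NumPy sketch (the toy subnet, weights, and dimensions are illustrative assumptions, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
Ws, Wt = rng.normal(size=(4, 2)), rng.normal(size=(4, 2))

def cond_subnet(h, c):
    # The condition c is concatenated to half of the variables; only the
    # subnet sees it, so invertibility in x is unaffected.
    inp = np.concatenate([h, c], axis=-1)
    return np.tanh(inp @ Ws), inp @ Wt       # scale (log) and translation

def forward(x, c):
    x1, x2 = x[..., :2], x[..., 2:]
    log_s, t = cond_subnet(x1, c)
    return np.concatenate([x1, x2 * np.exp(log_s) + t], axis=-1)

def inverse(z, c):
    z1, z2 = z[..., :2], z[..., 2:]
    log_s, t = cond_subnet(z1, c)            # same parameters, recomputed
    return np.concatenate([z1, (z2 - t) * np.exp(-log_s)], axis=-1)

x = rng.normal(size=(3, 4))
c = rng.normal(size=(3, 2))                  # e.g. an embedding of the input image
print(np.allclose(inverse(forward(x, c), c), x))  # invertible for every fixed c
```

Sampling different latent codes z for the same condition c then yields the diverse translations the paper targets.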

Colorization • Image-to-Image Translation • +1

Benchmarking Invertible Architectures on Inverse Problems

no code implementations • 26 Jan 2021 • Jakob Kruse, Lynton Ardizzone, Carsten Rother, Ullrich Köthe

Recent work demonstrated that flow-based invertible neural networks are promising tools for solving ambiguous inverse problems.

Invertible Neural Networks for Uncertainty Quantification in Photoacoustic Imaging

no code implementations • 10 Nov 2020 • Jan-Hinrich Nölke, Tim Adler, Janek Gröhl, Thomas Kirchner, Lynton Ardizzone, Carsten Rother, Ullrich Köthe, Lena Maier-Hein

Multispectral photoacoustic imaging (PAI) is an emerging imaging modality which enables the recovery of functional tissue parameters such as blood oxygenation.

Amortized Bayesian model comparison with evidential deep learning

1 code implementation • 22 Apr 2020 • Stefan T. Radev, Marco D'Alessandro, Ulf K. Mertens, Andreas Voss, Ullrich Köthe, Paul-Christian Bürkner

This makes the method particularly effective in scenarios where model fit needs to be assessed for a large number of datasets, so that per-dataset inference is practically infeasible. Finally, we propose a novel way to measure epistemic uncertainty in model comparison problems.

BayesFlow: Learning complex stochastic models with invertible neural networks

1 code implementation • 13 Mar 2020 • Stefan T. Radev, Ulf K. Mertens, Andreas Voss, Lynton Ardizzone, Ullrich Köthe

In addition, our method incorporates a summary network trained to embed the observed data into maximally informative summary statistics.
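The summary-network idea can be illustrated with a deep-set style embedding: per-observation features followed by permutation-invariant pooling, so datasets of any size map to one fixed-length summary vector. A toy NumPy sketch (the fixed random features are a stand-in for BayesFlow's learned network, an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(1, 8))                  # hypothetical feature weights

def summary_net(dataset):
    # Per-observation features, then a mean over observations: the result
    # is invariant to the ordering and size of the dataset.
    h = np.tanh(dataset[:, None] * W)        # (n_obs, 8) per-point features
    return h.mean(axis=0)                    # permutation-invariant pooling

small = rng.normal(size=50)
large = rng.normal(size=500)
print(summary_net(small).shape, summary_net(large).shape)  # both (8,)
```

In BayesFlow, this fixed-length summary is what conditions the invertible inference network.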

Bayesian Inference • Epidemiology

Training Normalizing Flows with the Information Bottleneck for Competitive Generative Classification

3 code implementations • NeurIPS 2020 • Lynton Ardizzone, Radek Mackowiak, Carsten Rother, Ullrich Köthe

In this work, we first develop the theory and methodology of IB-INNs, a class of conditional normalizing flows where INNs are trained using the IB objective: introducing a small amount of *controlled* information loss allows for an asymptotically exact formulation of the IB, while keeping the INN's generative capabilities intact.

General Classification • Out-of-Distribution Detection

Disentanglement by Nonlinear ICA with General Incompressible-flow Networks (GIN)

1 code implementation • ICLR 2020 • Peter Sorrenson, Carsten Rother, Ullrich Köthe

Furthermore, the recovered informative latent variables will be in one-to-one correspondence with the true latent variables of the generating process, up to a trivial component-wise transformation.

Representation Learning

Object Segmentation using Pixel-wise Adversarial Loss

no code implementations • 23 Sep 2019 • Ricard Durall, Franz-Josef Pfreundt, Ullrich Köthe, Janis Keuper

Recent deep learning based approaches have shown remarkable success on object segmentation tasks.

Semantic Segmentation

HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference

1 code implementation • 25 May 2019 • Jakob Kruse, Gianluca Detommaso, Ullrich Köthe, Robert Scheichl

Many recent invertible neural architectures are based on coupling block designs where variables are divided in two subsets which serve as inputs of an easily invertible (usually affine) triangular transformation.
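The coupling design described here can be sketched directly: one subset passes through unchanged and parameterizes an affine map of the other, so both the exact inverse and the log-Jacobian-determinant (the product of the scales, thanks to the triangular structure) come essentially for free. A minimal NumPy sketch (the stand-in subnet and dimensions are assumptions for illustration, not HINT's hierarchical architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))
subnet = lambda h: (np.tanh(h @ W1), h @ W2)   # stand-in for a neural network

def forward(x):
    x1, x2 = x[..., :2], x[..., 2:]
    log_s, t = subnet(x1)
    z = np.concatenate([x1, x2 * np.exp(log_s) + t], axis=-1)
    # Triangular Jacobian: its log-determinant is just the sum of log_s,
    # which is what makes exact density estimation cheap.
    return z, log_s.sum(axis=-1)

def inverse(z):
    z1, z2 = z[..., :2], z[..., 2:]
    log_s, t = subnet(z1)                      # the subnet is never inverted
    return np.concatenate([z1, (z2 - t) * np.exp(-log_s)], axis=-1)

x = rng.normal(size=(5, 4))
z, log_det = forward(x)
print(np.allclose(inverse(z), x))              # exact inverse by construction
```

Note that invertibility never requires inverting the subnet itself, which is why the subnet can be an arbitrary network.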

Bayesian Inference • Density Estimation

The Mutex Watershed and its Objective: Efficient, Parameter-Free Graph Partitioning

no code implementations • 25 Apr 2019 • Steffen Wolf, Alberto Bailoni, Constantin Pape, Nasim Rahaman, Anna Kreshuk, Ullrich Köthe, Fred A. Hamprecht

Unlike seeded watershed, the algorithm can accommodate not only attractive but also repulsive cues, allowing it to find a previously unspecified number of segments without the need for explicit seeds or a tunable threshold.
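The greedy core of this idea can be sketched with a union-find structure that tracks mutex (repulsion) constraints alongside merges: cues are processed in order of decreasing strength, attractive edges merge clusters unless a mutex forbids it, and repulsive edges install such constraints. A simplified NumPy-free Python sketch (the toy graph and edge weights are illustrative assumptions, not the paper's benchmark data):

```python
def mutex_watershed(n, edges):
    """Greedy mutex watershed on n nodes.

    edges: iterable of (weight, u, v, attractive). Attractive edges merge
    clusters; repulsive edges forbid future merges. No seeds, no threshold.
    """
    parent = list(range(n))
    mutex = [set() for _ in range(n)]          # constraints kept per root

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]      # path halving
            a = parent[a]
        return a

    for w, u, v, attractive in sorted(edges, reverse=True):
        ru, rv = find(u), find(v)
        if ru == rv:
            continue
        if attractive:
            if rv not in mutex[ru]:            # merge unless a mutex forbids it
                parent[rv] = ru
                mutex[ru] |= mutex[rv]         # inherit the absorbed constraints
                for m in mutex[rv]:
                    m_set = mutex[m]
                    m_set.discard(rv)
                    m_set.add(ru)              # repoint constraints to new root
        else:
            mutex[ru].add(rv)                  # repulsive cue: forbid this merge
            mutex[rv].add(ru)

    return [find(i) for i in range(n)]

# Toy graph: strong attraction 0-1 and 2-3, stronger repulsion between 1 and 2.
edges = [(0.9, 0, 1, True), (0.8, 2, 3, True),
         (0.95, 1, 2, False), (0.5, 1, 2, True)]
labels = mutex_watershed(4, edges)
print(labels)  # nodes 0,1 share a label; 2,3 share another
```

The repulsive cue between nodes 1 and 2 keeps the two clusters apart even though a weaker attractive edge connects them, which is exactly how the number of segments emerges without explicit seeds.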

Graph Partitioning

Uncertainty-aware performance assessment of optical imaging modalities with invertible neural networks

no code implementations • 8 Mar 2019 • Tim J. Adler, Lynton Ardizzone, Anant Vemuri, Leonardo Ayala, Janek Gröhl, Thomas Kirchner, Sebastian Wirkert, Jakob Kruse, Carsten Rother, Ullrich Köthe, Lena Maier-Hein

Assessment of the specific hardware used in conjunction with such algorithms, however, has not properly addressed the possibility that the problem may be ill-posed.

Analyzing Inverse Problems with Invertible Neural Networks

2 code implementations • ICLR 2019 • Lynton Ardizzone, Jakob Kruse, Sebastian Wirkert, Daniel Rahner, Eric W. Pellegrini, Ralf S. Klessen, Lena Maier-Hein, Carsten Rother, Ullrich Köthe

Often, the forward process from parameter- to measurement-space is a well-defined function, whereas the inverse problem is ambiguous: one measurement may map to multiple different sets of parameters.
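A toy example makes this ambiguity concrete (the forward model below is an illustrative assumption, not one of the paper's applications): the forward map is a perfectly well-defined function, yet two distinct parameter sets yield the identical measurement, so no single-valued inverse exists.

```python
import numpy as np

# Forward process: a well-defined function from parameter- to measurement-space.
def forward(x):
    return x[0] ** 2 + x[1] ** 2

# Two different parameter vectors produce the same measurement, so the
# inverse "measurement -> parameters" is ambiguous (here, a whole circle
# of parameters maps to the value 25).
x_a = np.array([3.0, 4.0])
x_b = np.array([5.0, 0.0])
print(forward(x_a), forward(x_b))  # 25.0 25.0
```

Invertible networks address this by learning the full posterior over parameters for a given measurement rather than a single point estimate.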

Learned Watershed: End-to-End Learning of Seeded Segmentation

no code implementations • ICCV 2017 • Steffen Wolf, Lukas Schott, Ullrich Köthe, Fred Hamprecht

Learned boundary maps are known to outperform hand-crafted ones as a basis for the watershed algorithm.
