no code implementations • 23 Sep 2023 • Daniel Gaa, Vassillen Chizhov, Pascal Peter, Joachim Weickert, Robin Dirk Adam
In contrast to traditional denoising methods that adapt the operator to the data, our approach adapts the data to the operator.
no code implementations • 15 Sep 2023 • Pascal Peter
In order to shed light on these connections to classical image filtering, we propose a generalised scale-space theory for probabilistic diffusion models.
no code implementations • 17 Mar 2023 • Karl Schrader, Pascal Peter, Niklas Kämper, Joachim Weickert
It trains a mask generation network with the help of a neural inpainting surrogate.
no code implementations • 14 Mar 2023 • Paul Bungert, Pascal Peter, Joachim Weickert
For our blending purposes, we explore several ways to compose drift vector fields based on the derivatives of our input images.
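For illustration, a minimal sketch of one such composition. It assumes the canonical osmosis drift field d = ∇u/u (for which the image u is a steady state of the osmosis evolution) and one hypothetical composition rule, a hard mask-based switch between the two fields; the paper explores several compositions, and this is not necessarily among them.

```python
import numpy as np

def drift_field(u, eps=1e-6):
    """Canonical drift vector field for osmosis filtering, d = grad(u) / u,
    so that the (positive) image u is a steady state. Forward differences;
    eps guards against division by zero."""
    dx = np.diff(u, axis=1, append=u[:, -1:]) / (u + eps)
    dy = np.diff(u, axis=0, append=u[-1:, :]) / (u + eps)
    return dx, dy

def blend_drift(u1, u2, mask):
    """One simple (hypothetical) way to compose drift fields: take the
    field of u1 where mask is True and the field of u2 elsewhere."""
    d1x, d1y = drift_field(u1)
    d2x, d2y = drift_field(u2)
    return np.where(mask, d1x, d2x), np.where(mask, d1y, d2y)
```

A constant image yields a zero drift field, so blending it with a second image transplants only the second image's structure into the unmasked region.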
no code implementations • 14 Mar 2023 • Pascal Peter
Probabilistic diffusion models enjoy increasing popularity in the deep learning community.
no code implementations • 30 Aug 2022 • Pascal Peter, Karl Schrader, Tobias Alt, Joachim Weickert
This provides real-time performance with high-quality results.
no code implementations • 11 Feb 2022 • Pascal Peter
However, a carefully selected mask of known pixels that yields a high-quality inpainting can also act as a sparse image representation.
no code implementations • 6 Oct 2021 • Tobias Alt, Pascal Peter, Joachim Weickert
Diffusion-based inpainting is a powerful tool for the reconstruction of images from sparse data.
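As a rough illustration of the underlying idea (not this paper's method, which involves learning), the simplest instance is homogeneous diffusion inpainting: unknown pixels satisfy the Laplace equation, with the known pixels acting as fixed Dirichlet data. A minimal NumPy sketch with a Jacobi-type solver and, for brevity, periodic boundaries:

```python
import numpy as np

def diffusion_inpaint(image, mask, iters=2000):
    """Reconstruct an image from sparse known pixels via homogeneous
    diffusion: unknown pixels are repeatedly replaced by the average of
    their four neighbours (a Jacobi solver for the Laplace equation),
    while known pixels (mask == True) stay fixed."""
    u = np.where(mask, image, image[mask].mean())  # init unknowns with mean
    for _ in range(iters):
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                      + np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(mask, image, avg)
    return u
```

By the maximum principle of the averaging step, the reconstruction always stays within the range of the known pixel values.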
no code implementations • 31 Aug 2021 • Tobias Alt, Karl Schrader, Joachim Weickert, Pascal Peter, Matthias Augustin
With only a few small filters, they can achieve the same invariance as existing techniques which require a fine-grained sampling of orientations.
no code implementations • 30 Jul 2021 • Tobias Alt, Karl Schrader, Matthias Augustin, Pascal Peter, Joachim Weickert
We connect these concepts to residual networks, recurrent neural networks, and U-net architectures.
no code implementations • 29 Mar 2021 • Tobias Alt, Pascal Peter, Joachim Weickert, Karl Schrader
We investigate what can be learned from translating numerical algorithms into neural networks.
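One well-known instance of this translation, sketched here with a fixed rather than learned filter: a single explicit step of linear diffusion, u^{k+1} = u^k + tau * Laplacian(u^k), has exactly the identity-plus-convolution structure of a residual block.

```python
import numpy as np

# Standard 5-point stencil for the discrete Laplacian.
LAPLACIAN = np.array([[0., 1., 0.],
                      [1., -4., 1.],
                      [0., 1., 0.]])

def conv2d(u, k):
    """Naive 'same' 2-D convolution with zero padding (kernel is
    symmetric, so convolution equals correlation here)."""
    h, w = u.shape
    p = np.pad(u, 1)
    out = np.zeros_like(u)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(p[i:i+3, j:j+3] * k)
    return out

def diffusion_block(u, tau=0.2):
    """One explicit diffusion step, read as a residual block: identity
    skip connection plus a (here fixed) convolution. Stability of the
    explicit scheme requires tau <= 0.25 in 2-D."""
    return u + tau * conv2d(u, LAPLACIAN)
```

Stacking such blocks corresponds to running the explicit scheme for several time steps, which is the structural analogy to deep residual networks.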
no code implementations • 18 Mar 2021 • Pascal Peter
Recently, sparsification scale-spaces have been obtained as a sequence of inpainted images by gradually removing known image data.
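A toy sketch of this idea follows; the pixel-importance heuristic below (largest Laplacian magnitude) and the simple averaging inpainting are assumptions for illustration, not the paper's actual criteria.

```python
import numpy as np

def inpaint(image, mask, iters=300):
    """Fill unknown pixels by iterated 4-neighbour averaging
    (periodic boundaries for brevity); known pixels stay fixed."""
    u = np.where(mask, image, image[mask].mean())
    for _ in range(iters):
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                      + np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(mask, image, avg)
    return u

def sparsification_scale_space(image, densities=(0.5, 0.2, 0.05)):
    """For each decreasing pixel density, keep the pixels with the
    largest Laplacian magnitude and inpaint the rest. The sequence of
    reconstructions is progressively simplified: a scale-space indexed
    by the amount of known data rather than by diffusion time."""
    lap = np.abs(np.roll(image, 1, 0) + np.roll(image, -1, 0)
                 + np.roll(image, 1, 1) + np.roll(image, -1, 1)
                 - 4.0 * image)
    order = np.argsort(lap.ravel())[::-1]  # most important pixels first
    results = []
    for d in densities:
        k = max(1, int(d * image.size))
        mask = np.zeros(image.size, dtype=bool)
        mask[order[:k]] = True
        mask = mask.reshape(image.shape)
        results.append((mask, inpaint(image, mask)))
    return results
```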
no code implementations • 1 Feb 2021 • Sarah Andris, Joachim Weickert, Tobias Alt, Pascal Peter
Our codec consistently outperforms JPEG and offers useful guidance for the further development of hybrid codecs.
no code implementations • 26 Oct 2020 • Rahul Mohideen Kaja Mohideen, Pascal Peter, Joachim Weickert
Inpainting-based compression represents images in terms of a sparse subset of their pixel data.
no code implementations • 24 Aug 2020 • Sarah Andris, Pascal Peter, Rahul Mohideen Kaja Mohideen, Joachim Weickert, Sebastian Hoffmann
Compression methods based on inpainting are an evolving alternative to classical transform-based codecs for still images.
no code implementations • 7 Feb 2020 • Tobias Alt, Joachim Weickert, Pascal Peter
Convolutional neural networks (CNNs) often perform well, but their stability is poorly understood.
no code implementations • 18 Dec 2017 • Tim Dahmen, Patrick Trampert, Pascal Peter, Pinak Bheed, Joachim Weickert, Philipp Slusallek
The approach has the advantage of being agnostic to most model-based parts of exemplar-based inpainting, such as the order in which patches are processed and the cost function used to determine patch similarity.
no code implementations • 20 Jun 2017 • Laurent Hoeltgen, Pascal Peter, Michael Breuß
This leads us to the central question of which kind of feature vectors is best suited for image compression.