1 code implementation • 14 Dec 2022 • Samira Kabri, Alexander Auras, Danilo Riccio, Hartmut Bauermeister, Martin Benning, Michael Moeller, Martin Burger
Reconstructing images from noisy measurements of their Radon transform, as required in computerized tomography (CT), is a typical example of an ill-posed linear inverse problem.
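This kind of ill-posedness can be illustrated numerically: a forward operator with rapidly decaying singular values (a hypothetical stand-in for the smoothing behavior of the Radon transform, not the actual CT operator) amplifies even tiny measurement noise enormously under naive inversion. A minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Surrogate forward operator with rapidly decaying singular values,
# mimicking the smoothing that makes tomographic reconstruction
# ill-posed (illustration only, not the actual Radon transform).
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.exp(-0.5 * np.arange(n))        # condition number ~ 4e10
A = U @ np.diag(s) @ V.T

x_true = np.sin(np.linspace(0.0, np.pi, n))   # unknown "image"
y = A @ x_true                                # clean measurements
y_noisy = y + 1e-6 * rng.standard_normal(n)   # tiny measurement noise

x_clean = np.linalg.solve(A, y)        # near-perfect recovery
x_noisy = np.linalg.solve(A, y_noisy)  # naive inversion blows up

data_change = np.linalg.norm(y_noisy - y)            # on the order of 1e-6
solution_change = np.linalg.norm(x_noisy - x_clean)  # orders of magnitude larger
```

The data perturbation is invisible at plotting scale, yet the naive reconstruction changes by many orders of magnitude more, which is exactly the failure of continuous dependence that regularization methods address.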
no code implementations • 13 Jul 2021 • Hartmut Bauermeister, Emanuel Laude, Thomas Möllenhoff, Michael Moeller, Daniel Cremers
In contrast to existing discretizations which suffer from a grid bias, we show that a piecewise polynomial discretization better preserves the continuous nature of our problem.
1 code implementation • NeurIPS 2020 • Jonas Geiping, Hartmut Bauermeister, Hannah Dröge, Michael Moeller
The idea of federated learning is to collaboratively train a neural network on a server.
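The basic training loop behind this idea can be sketched as federated averaging (the FedAvg scheme): the server broadcasts the current weights, each client runs a few gradient steps on its private data, and the server averages the returned weights. The toy linear model and all constants below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n_clients, n_features = 4, 3
w_true = rng.standard_normal(n_features)

# Each client holds private data that never leaves the device;
# only model weights are exchanged with the server.
clients = []
for _ in range(n_clients):
    X = rng.standard_normal((32, n_features))
    clients.append((X, X @ w_true))

def local_update(w, X, y, lr=0.1, steps=5):
    """A few steps of local gradient descent on one client's data."""
    for _ in range(steps):
        w = w - lr * 2.0 * X.T @ (X @ w - y) / len(y)
    return w

# Federated averaging: broadcast weights, average the returned updates.
w = np.zeros(n_features)
for _ in range(100):
    w = np.mean([local_update(w, X, y) for X, y in clients], axis=0)
```

Only the weight vectors cross the network in this scheme; the question the paper raises is how much of the private data those shared updates nevertheless reveal.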
no code implementations • 23 Oct 2020 • Hartmut Bauermeister, Martin Burger, Michael Moeller
One of the main challenges with linear inverse problems is that most of them are ill-posed, in the sense that the solution does not depend continuously on the data.
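A classical worked example of this failure of continuous dependence is differentiation, i.e. inverting the integration operator $Ax(t) = \int_0^t x(s)\,ds$ with data $y = Ax$. An arbitrarily small, highly oscillatory data perturbation produces an arbitrarily large perturbation of the solution $x = y'$:

```latex
y_\varepsilon(t) = y(t) + \varepsilon \sin\!\left(\frac{t}{\varepsilon^2}\right)
\quad\Longrightarrow\quad
\|y_\varepsilon - y\|_\infty = \varepsilon \xrightarrow[\varepsilon \to 0]{} 0,
\qquad
\|x_\varepsilon - x\|_\infty
= \left\| \frac{1}{\varepsilon} \cos\!\left(\frac{t}{\varepsilon^2}\right) \right\|_\infty
= \frac{1}{\varepsilon} \xrightarrow[\varepsilon \to 0]{} \infty.
```

So the inverse map $y \mapsto x$ is unbounded, and any practical reconstruction must trade data fidelity against a regularizing prior.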
no code implementations • 1 Jul 2020 • Christina Runkel, Stefan Dorenkamp, Hartmut Bauermeister, Michael Moeller
We demonstrate that training purely on softmax inputs in combination with scarce training data leads to overfitting, as the network memorizes the inputs.
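The memorization effect of scarce training data can be reproduced in a generic setting (this is an illustrative sketch, not the paper's softmax-input architecture): with random labels and more parameters than samples, a simple classifier fits the training set essentially perfectly while remaining at chance level on held-out data.

```python
import numpy as np

rng = np.random.default_rng(2)
n_train, n_test, n_features = 20, 200, 100

# Random labels: above-chance test accuracy is impossible, so any
# perfect training fit is pure memorization (generic illustration,
# not the paper's softmax-input setup).
X_train = rng.standard_normal((n_train, n_features))
y_train = rng.integers(0, 2, n_train)
X_test = rng.standard_normal((n_test, n_features))
y_test = rng.integers(0, 2, n_test)

# Logistic regression via plain gradient descent; with 100 features
# and only 20 samples the data is linearly separable, so the model
# can "learn the inputs by heart".
w = np.zeros(n_features)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X_train @ w))
    w -= 0.5 * X_train.T @ (p - y_train) / n_train

train_acc = np.mean((X_train @ w > 0) == y_train)
test_acc = np.mean((X_test @ w > 0) == y_test)
```

The gap between near-perfect training accuracy and chance-level test accuracy is the overfitting signature the snippet above refers to.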
no code implementations • 23 Apr 2020 • Jonas Geiping, Fjedor Gaede, Hartmut Bauermeister, Michael Moeller
We discuss this methodology in detail and present examples in multi-label segmentation via minimal partitions and in stereo estimation, demonstrating that the proposed graph discretization can reduce both the runtime and the memory consumption of convex relaxations of matching problems by up to a factor of 10.
6 code implementations • 31 Mar 2020 • Jonas Geiping, Hartmut Bauermeister, Hannah Dröge, Michael Moeller
The idea of federated learning is to collaboratively train a neural network on a server.