Search Results for author: Michael Moeller

Found 51 papers, 22 papers with code

Robustness and Exploration of Variational and Machine Learning Approaches to Inverse Problems: An Overview

no code implementations19 Feb 2024 Alexander Auras, Kanchana Vaishnavi Gandikota, Hannah Droege, Michael Moeller

This paper attempts to provide an overview of current approaches for solving inverse problems in imaging using variational methods and machine learning.

Evaluating Adversarial Robustness of Low dose CT Recovery

1 code implementation18 Feb 2024 Kanchana Vaishnavi Gandikota, Paramanand Chandramouli, Hannah Droege, Michael Moeller

Both classical approaches and deep networks are affected by such attacks, with extremely small perturbations leading to changes in the visual appearance of localized lesions.

Adversarial Robustness Computed Tomography (CT)

Temporal Action Localization for Inertial-based Human Activity Recognition

no code implementations27 Nov 2023 Marius Bock, Michael Moeller, Kristof Van Laerhoven

Our results show that state-of-the-art TAL models are able to outperform popular inertial models on 4 out of 6 wearable activity recognition benchmark datasets, with improvements of as much as 25% in F1-score.

Human Activity Recognition Temporal Action Localization +1

SIGMA: Scale-Invariant Global Sparse Shape Matching

no code implementations ICCV 2023 Maolin Gao, Paul Roetzer, Marvin Eisenberger, Zorah Lähner, Michael Moeller, Daniel Cremers, Florian Bernard

We propose a novel mixed-integer programming (MIP) formulation for generating precise sparse correspondences for highly non-rigid shapes.

An Evaluation of Zero-Cost Proxies -- from Neural Architecture Performance to Model Robustness

no code implementations18 Jul 2023 Jovita Lukasik, Michael Moeller, Margret Keuper

We are interested in the single prediction task for robustness and the joint multi-objective of clean and robust accuracy.

Feature Importance

Differentiable Sensor Layouts for End-to-End Learning of Task-Specific Camera Parameters

no code implementations28 Apr 2023 Hendrik Sommerhoff, Shashank Agnihotri, Mohamed Saleh, Michael Moeller, Margret Keuper, Andreas Kolb

The success of deep learning is frequently described as the ability to train all parameters of a network on a specific application in an end-to-end fashion.

Semantic Segmentation

WEAR: An Outdoor Sports Dataset for Wearable and Egocentric Activity Recognition

1 code implementation11 Apr 2023 Marius Bock, Hilde Kuehne, Kristof Van Laerhoven, Michael Moeller

Though research has shown the complementarity of camera- and inertial-based data, datasets which offer both egocentric video and inertial-based sensor data remain scarce.

Egocentric Activity Recognition Human Activity Recognition +2

Convergent Data-driven Regularizations for CT Reconstruction

1 code implementation14 Dec 2022 Samira Kabri, Alexander Auras, Danilo Riccio, Hartmut Bauermeister, Martin Benning, Michael Moeller, Martin Burger

The reconstruction of images from their corresponding noisy Radon transform is a typical example of an ill-posed linear inverse problem as arising in the application of computerized tomography (CT).

On Adversarial Robustness of Deep Image Deblurring

no code implementations5 Oct 2022 Kanchana Vaishnavi Gandikota, Paramanand Chandramouli, Michael Moeller

Recent approaches employ deep learning-based solutions for the recovery of a sharp image from its blurry observation.

Adversarial Robustness Deblurring +1

Intrinsic Neural Fields: Learning Functions on Manifolds

1 code implementation15 Mar 2022 Lukas Koestler, Daniel Grittner, Michael Moeller, Daniel Cremers, Zorah Lähner

Neural fields have gained significant attention in the computer vision community due to their excellent performance in novel view synthesis, geometry reconstruction, and generative modeling.

Novel View Synthesis

DARTS for Inverse Problems: a Study on Stability

no code implementations NeurIPS Workshop Deep_Invers 2021 Jonas Geiping, Jovita Lukasik, Margret Keuper, Michael Moeller

Differentiable architecture search (DARTS) is a widely researched tool for neural architecture search, due to its promising results for image classification.

Image Classification Neural Architecture Search

Tutorial on Deep Learning for Human Activity Recognition

1 code implementation13 Oct 2021 Marius Bock, Alexander Hoelzemann, Michael Moeller, Kristof Van Laerhoven

Activity recognition systems that are capable of estimating human activities from wearable inertial sensors have come a long way in the past decades.

Feature Engineering Human Activity Recognition

Stochastic Training is Not Necessary for Generalization

1 code implementation ICLR 2022 Jonas Geiping, Micah Goldblum, Phillip E. Pope, Michael Moeller, Tom Goldstein

It is widely believed that the implicit regularization of SGD is fundamental to the impressive generalization behavior we observe in neural networks.

Data Augmentation

Is Differentiable Architecture Search truly a One-Shot Method?

no code implementations12 Aug 2021 Jonas Geiping, Jovita Lukasik, Margret Keuper, Michael Moeller

In this work, we investigate DAS in a systematic case study of inverse problems, which allows us to analyze these potential benefits in a controlled manner.

Hyperparameter Optimization Image Classification +2

Improving Deep Learning for HAR with shallow LSTMs

1 code implementation2 Aug 2021 Marius Bock, Alexander Hoelzemann, Michael Moeller, Kristof Van Laerhoven

Recent studies in Human Activity Recognition (HAR) have shown that Deep Learning methods are able to outperform classical Machine Learning algorithms.

Human Activity Recognition

Lifting the Convex Conjugate in Lagrangian Relaxations: A Tractable Approach for Continuous Markov Random Fields

no code implementations13 Jul 2021 Hartmut Bauermeister, Emanuel Laude, Thomas Möllenhoff, Michael Moeller, Daniel Cremers

In contrast to existing discretizations which suffer from a grid bias, we show that a piecewise polynomial discretization better preserves the continuous nature of our problem.

Stereo Matching

Adiabatic Quantum Graph Matching with Permutation Matrix Constraints

no code implementations8 Jul 2021 Marcel Seelbach Benkner, Vladislav Golyanik, Christian Theobalt, Michael Moeller

In this work, we address such problems with emerging quantum computing technology and propose several reformulations of QAPs as unconstrained problems suitable for efficient execution on quantum hardware.

Graph Matching

Q-Match: Iterative Shape Matching via Quantum Annealing

no code implementations ICCV 2021 Marcel Seelbach Benkner, Zorah Lähner, Vladislav Golyanik, Christof Wunderlich, Christian Theobalt, Michael Moeller

Finding shape correspondences can be formulated as an NP-hard quadratic assignment problem (QAP) that becomes infeasible for shapes with high sampling density.
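To make the combinatorial difficulty concrete, here is a brute-force solver for a toy Koopmans-Beckmann QAP, min over permutation matrices P of trace(A P B Pᵀ). This is an illustrative sketch of the problem class only, not the paper's quantum annealing method; the matrices and sizes are arbitrary choices, and the factorial search space is exactly why such exhaustive enumeration becomes infeasible for densely sampled shapes.

```python
import itertools
import numpy as np

def qap_brute_force(A, B):
    """Exhaustively minimize trace(A P B P^T) over all n! permutations."""
    n = A.shape[0]
    best_cost, best_perm = np.inf, None
    for perm in itertools.permutations(range(n)):
        P = np.eye(n)[list(perm)]            # permutation matrix for this ordering
        cost = np.trace(A @ P @ B @ P.T)
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    return best_cost, best_perm

rng = np.random.default_rng(1)
A = rng.random((5, 5))
B = rng.random((5, 5))
cost, perm = qap_brute_force(A, B)           # searches all 5! = 120 permutations
print(cost, perm)
```

Already at n = 20 the search space exceeds 10^18 permutations, which motivates relaxations and, as in this line of work, quantum annealing formulations.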

What Doesn't Kill You Makes You Robust(er): How to Adversarially Train against Data Poisoning

1 code implementation26 Feb 2021 Jonas Geiping, Liam Fowl, Gowthami Somepalli, Micah Goldblum, Michael Moeller, Tom Goldstein

Data poisoning is a threat model in which a malicious actor tampers with training data to manipulate outcomes at inference time.

Data Poisoning

Learning Spectral Regularizations for Linear Inverse Problems

no code implementations23 Oct 2020 Hartmut Bauermeister, Martin Burger, Michael Moeller

One of the main challenges in linear inverse problems is that a majority of such problems are ill-posed in the sense that the solution does not depend on the data continuously.
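The discontinuous dependence on the data can be seen directly in the singular value decomposition of the forward operator. The following is a hypothetical numerical illustration (not the paper's learned regularization): naive inversion divides by every singular value and blows up noise, while a classical Tikhonov-type spectral filter damps the unreliable components.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -6, n)                    # rapidly decaying singular values: ill-posed
A = U @ np.diag(s) @ V.T                     # synthetic forward operator

x_true = rng.standard_normal(n)
y = A @ x_true + 1e-4 * rng.standard_normal(n)   # noisy measurements

# Naive inversion: noise in the small-singular-value directions is amplified.
x_naive = V @ ((U.T @ y) / s)

# Tikhonov spectral filter s_i / (s_i^2 + alpha) suppresses those directions.
alpha = 1e-6
x_tik = V @ ((s / (s**2 + alpha)) * (U.T @ y))

err_naive = np.linalg.norm(x_naive - x_true)
err_tik = np.linalg.norm(x_tik - x_true)
assert err_tik < err_naive                   # filtering stabilizes the reconstruction
```

Data-driven spectral regularization, as studied in the paper, can be viewed as learning such a filter function rather than fixing it a priori.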

Witches' Brew: Industrial Scale Data Poisoning via Gradient Matching

2 code implementations ICLR 2021 Jonas Geiping, Liam Fowl, W. Ronny Huang, Wojciech Czaja, Gavin Taylor, Michael Moeller, Tom Goldstein

We consider a particularly malicious poisoning attack that is both "from scratch" and "clean label", meaning we analyze an attack that successfully works against new, randomly initialized models, and is nearly imperceptible to humans, all while perturbing only a small fraction of the training data.

Data Poisoning

Exploiting the Logits: Joint Sign Language Recognition and Spell-Correction

no code implementations1 Jul 2020 Christina Runkel, Stefan Dorenkamp, Hartmut Bauermeister, Michael Moeller

We demonstrate that purely learning on softmax inputs in combination with scarce training data yields overfitting as the network learns the inputs by heart.

Gesture Recognition Sign Language Recognition +1

A Simple Domain Shifting Network for Generating Low Quality Images

1 code implementation30 Jun 2020 Guruprasad Hegde, Avinash Nittur Ramesh, Kanchana Vaishnavi Gandikota, Roman Obermaisser, Michael Moeller

Deep Learning systems have proven to be extremely successful for image recognition tasks for which significant amounts of training data are available, e.g., on the famous ImageNet dataset.

Classification Domain Adaptation +1

A Generative Model for Generic Light Field Reconstruction

no code implementations13 May 2020 Paramanand Chandramouli, Kanchana Vaishnavi Gandikota, Andreas Goerlitz, Andreas Kolb, Michael Moeller

We develop a generative model conditioned on the central view of the light field and incorporate this as a prior in an energy minimization framework to address diverse light field reconstruction tasks.

Super-Resolution

Fast Convex Relaxations using Graph Discretizations

no code implementations23 Apr 2020 Jonas Geiping, Fjedor Gaede, Hartmut Bauermeister, Michael Moeller

We discuss this methodology in detail and show examples in multi-label segmentation by minimal partitions and stereo estimation, where we demonstrate that the proposed graph discretization can reduce runtime as well as memory consumption of convex relaxations of matching problems by up to a factor of 10.

Optical Flow Estimation Segmentation

Truth or Backpropaganda? An Empirical Investigation of Deep Learning Theory

1 code implementation ICLR 2020 Micah Goldblum, Jonas Geiping, Avi Schwarzschild, Michael Moeller, Tom Goldstein

We empirically evaluate common assumptions about neural networks that are widely held by practitioners and theorists alike.

Learning Theory

Energy Dissipation with Plug-and-Play Priors

no code implementations NeurIPS Workshop Deep_Invers 2019 Hendrik Sommerhoff, Andreas Kolb, Michael Moeller

In this paper we consider the combination of both approaches by projecting the outputs of a plug-and-play denoising network onto the cone of descent directions to a given energy.

Image Denoising
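A minimal sketch of the descent-cone idea from the "Energy Dissipation with Plug-and-Play Priors" entry above, in our own simplified form (not the paper's exact construction): any proposed update direction d, e.g. the step suggested by a denoising network, is projected so that it cannot increase a given energy E. Here E(x) = 0.5‖x − b‖² stands in for a data-fidelity energy.

```python
import numpy as np

def project_onto_descent_cone(d, grad_E):
    """Remove the uphill component of d so that <d, grad_E> <= 0 afterwards."""
    inner = d @ grad_E
    if inner > 0:
        d = d - (inner / (grad_E @ grad_E)) * grad_E
    return d

rng = np.random.default_rng(2)
b = rng.standard_normal(10)
x = rng.standard_normal(10)
grad = x - b                          # gradient of E(x) = 0.5*||x - b||^2 at x
d = rng.standard_normal(10)           # arbitrary proposed update (denoiser step)
d_safe = project_onto_descent_cone(d, grad)
assert d_safe @ grad <= 1e-12         # guaranteed non-ascent direction for E
```

Taking small steps along such projected directions guarantees monotone energy dissipation regardless of what the network proposes.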

Parametric Majorization for Data-Driven Energy Minimization Methods

1 code implementation ICCV 2019 Jonas Geiping, Michael Moeller

Energy minimization methods are a classical tool in a multitude of computer vision applications.

Controlling Neural Networks via Energy Dissipation

no code implementations ICCV 2019 Michael Moeller, Thomas Möllenhoff, Daniel Cremers

The last decade has shown a tremendous success in solving various computer vision problems with the help of deep learning techniques.

Computed Tomography (CT) Deblurring +2

Convolutional Simplex Projection Network (CSPN) for Weakly Supervised Semantic Segmentation

1 code implementation24 Jul 2018 Rania Briq, Michael Moeller, Juergen Gall

Weakly supervised semantic segmentation has been a subject of increased interest due to the scarcity of fully annotated images.

Segmentation Weakly supervised Semantic Segmentation +1

Are good local minima wide in sparse recovery?

no code implementations21 Jun 2018 Michael Moeller, Otmar Loffeld, Juergen Gall, Felix Krahmer

The idea of compressed sensing is to exploit representations in suitable (overcomplete) dictionaries that allow one to recover signals far beyond the Nyquist rate, provided that they admit a sparse representation in the respective dictionary.
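As a hedged illustration of the sparse-recovery setting (a generic ISTA solver, not this paper's analysis of local minima): a sparse vector is recovered from far fewer measurements than unknowns by minimizing 0.5‖Ax − y‖² + λ‖x‖₁ with iterative soft thresholding. All dimensions and parameters below are arbitrary choices.

```python
import numpy as np

def ista(A, y, lam=0.05, n_iter=500):
    """Iterative soft-thresholding for the l1-regularized least-squares problem."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - (A.T @ (A @ x - y)) / L      # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

rng = np.random.default_rng(3)
m, n, k = 40, 100, 4                         # far fewer measurements than unknowns
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat = ista(A, y)
print(np.linalg.norm(x_hat - x_true))        # reconstruction error of the sparse signal
```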

Lifting Layers: Analysis and Applications

1 code implementation ECCV 2018 Peter Ochs, Tim Meinhardt, Laura Leal-Taixe, Michael Moeller

A lifting layer increases the dimensionality of the input, naturally yields a linear spline when combined with a fully connected layer, and therefore closes the gap between low and high dimensional approximation problems.

Denoising Image Classification
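An illustrative sketch of the lifting idea described in the "Lifting Layers" entry above, in our simplified reading (not the authors' implementation): a scalar x is lifted to its barycentric coordinates with respect to fixed knots t₁ < … < t_K, so that a subsequent fully connected layer wᵀℓ(x) evaluates the linear spline with values w at the knots.

```python
import numpy as np

def lift(x, knots):
    """Lift scalar x to its barycentric coordinates w.r.t. the sorted knots."""
    K = len(knots)
    l = np.zeros(K)
    i = int(np.clip(np.searchsorted(knots, x) - 1, 0, K - 2))  # interval index
    t0, t1 = knots[i], knots[i + 1]
    l[i] = (t1 - x) / (t1 - t0)
    l[i + 1] = (x - t0) / (t1 - t0)
    return l

knots = np.linspace(-1.0, 1.0, 5)
w = np.abs(knots)                     # spline values at the knots: approximates |x|
for x in [-0.75, -0.5, 0.1, 0.6]:
    print(x, w @ lift(x, knots))      # piecewise-linear interpolation of |x|
```

Since |x| has its only kink at 0, which is itself a knot here, lifting plus a single linear layer reproduces it exactly, which is the sense in which the construction closes the gap between low- and high-dimensional approximation.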

Composite Optimization by Nonconvex Majorization-Minimization

no code implementations20 Feb 2018 Jonas Geiping, Michael Moeller

A popular class of algorithms for solving such problems are majorization-minimization techniques which iteratively approximate the composite nonconvex function by a majorizing function that is easy to minimize.

Super-Resolution
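The majorization-minimization principle from the "Composite Optimization" entry above can be sketched on a scalar toy problem (an illustrative reweighted scheme of our choosing, not the paper's algorithm): minimize the nonconvex composite objective f(x) = 0.5(x − b)² + λ√|x| by repeatedly majorizing the concave penalty √|x| with its tangent at the current iterate; each surrogate is then an easy soft-thresholding problem.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * max(abs(v) - t, 0.0)

def mm_sqrt_penalty(b, lam=0.5, n_iter=30, eps=1e-8):
    """Majorization-minimization for 0.5*(x-b)^2 + lam*sqrt(|x|)."""
    x = b                                             # start from the data
    for _ in range(n_iter):
        weight = lam / (2.0 * np.sqrt(abs(x) + eps))  # tangent slope of sqrt(|.|) at x
        x = soft_threshold(b, weight)                 # exact minimizer of the surrogate
    return x

b, lam = 2.0, 0.5
x_star = mm_sqrt_penalty(b, lam)
f = lambda x: 0.5 * (x - b) ** 2 + lam * np.sqrt(abs(x))
assert f(x_star) <= f(b) + 1e-12                      # MM never increases the objective
```

The monotone decrease of f is the defining guarantee of MM schemes: each surrogate touches f at the current iterate and lies above it everywhere else.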

Proximal Backpropagation

1 code implementation ICLR 2018 Thomas Frerix, Thomas Möllenhoff, Michael Moeller, Daniel Cremers

Specifically, we show that backpropagation of a prediction error is equivalent to sequential gradient descent steps on a quadratic penalty energy, which comprises the network activations as variables of the optimization.

Learning Proximal Operators: Using Denoising Networks for Regularizing Inverse Imaging Problems

1 code implementation ICCV 2017 Tim Meinhardt, Michael Moeller, Caner Hazirbas, Daniel Cremers

While variational methods have been among the most powerful tools for solving linear inverse problems in imaging, deep (convolutional) neural networks have recently taken the lead in many challenging benchmarks.

Demosaicking Denoising +1

Multiframe Motion Coupling for Video Super Resolution

1 code implementation23 Nov 2016 Jonas Geiping, Hendrik Dirks, Daniel Cremers, Michael Moeller

The idea of video super resolution is to use different viewpoints of a single scene to enhance the overall resolution and quality.

Motion Estimation Video Super-Resolution

Sublabel-Accurate Convex Relaxation of Vectorial Multilabel Energies

1 code implementation7 Apr 2016 Emanuel Laude, Thomas Möllenhoff, Michael Moeller, Jan Lellmann, Daniel Cremers

Convex relaxations of nonconvex multilabel problems have been demonstrated to produce superior (provably optimal or near-optimal) solutions to a variety of classical computer vision problems.

Color Image Denoising Image Denoising +1

Sublabel-Accurate Relaxation of Nonconvex Energies

2 code implementations CVPR 2016 Thomas Möllenhoff, Emanuel Laude, Michael Moeller, Jan Lellmann, Daniel Cremers

We propose a novel spatially continuous framework for convex relaxations based on functional lifting.

Learning Nonlinear Spectral Filters for Color Image Reconstruction

no code implementations ICCV 2015 Michael Moeller, Julia Diebold, Guy Gilboa, Daniel Cremers

This paper presents the idea of learning optimal filters for color image reconstruction based on a novel concept of nonlinear spectral image decompositions recently proposed by Guy Gilboa.

Image Denoising Image Reconstruction

Nonlinear Spectral Analysis via One-homogeneous Functionals - Overview and Future Prospects

no code implementations5 Oct 2015 Guy Gilboa, Michael Moeller, Martin Burger

We present in this paper the motivation and theory of nonlinear spectral representations, based on convex regularizing functionals.

Collaborative Total Variation: A General Framework for Vectorial TV Models

no code implementations6 Aug 2015 Joan Duran, Michael Moeller, Catalina Sbert, Daniel Cremers

Even after over two decades, the total variation (TV) remains one of the most popular regularizations for image processing problems and has sparked a tremendous amount of research, particularly to move from scalar to vector-valued functions.

Deblurring Denoising
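A small sketch of the "collaborative" idea for vectorial TV described above, with two generic norm choices of our own (not the paper's full taxonomy): the gradients of a color image form a (pixels × derivatives × channels) tensor, and different TV variants differ in how the norm is taken over the derivative and color dimensions before summing over pixels.

```python
import numpy as np

def vectorial_tv(img, mode="coupled"):
    """Discrete vectorial TV of an (H, W, C) image with forward differences."""
    dx = np.diff(img, axis=1, append=img[:, -1:, :])   # zero at the boundary
    dy = np.diff(img, axis=0, append=img[-1:, :, :])
    grad = np.stack([dx, dy], axis=-2)                 # shape (H, W, 2, C)
    if mode == "coupled":                              # l2 over derivatives AND channels
        return np.sqrt((grad ** 2).sum(axis=(-2, -1))).sum()
    if mode == "channelwise":                          # l2 per channel, summed over channels
        return np.sqrt((grad ** 2).sum(axis=-2)).sum()
    raise ValueError(mode)

rng = np.random.default_rng(4)
img = rng.random((8, 8, 3))
# Coupling the channels never exceeds the channel-by-channel sum (triangle inequality).
assert vectorial_tv(img, "coupled") <= vectorial_tv(img, "channelwise") + 1e-9
```

The coupled variant favors edges that are aligned across color channels, which is one of the modeling choices the collaborative framework makes explicit.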

Point-wise Map Recovery and Refinement from Functional Correspondence

no code implementations18 Jun 2015 Emanuele Rodolà, Michael Moeller, Daniel Cremers

Since their introduction in the shape analysis community, functional maps have met with considerable success due to their ability to compactly represent dense correspondences between deformable shapes, with applications ranging from shape matching and image segmentation, to exploration of large shape collections.

Image Segmentation Semantic Segmentation

Variational Depth from Focus Reconstruction

1 code implementation1 Aug 2014 Michael Moeller, Martin Benning, Carola Schönlieb, Daniel Cremers

This paper deals with the problem of reconstructing a depth map from a sequence of differently focused images, also known as depth from focus or shape from focus.

The Primal-Dual Hybrid Gradient Method for Semiconvex Splittings

no code implementations7 Jul 2014 Thomas Möllenhoff, Evgeny Strekalovskiy, Michael Moeller, Daniel Cremers

This paper deals with the analysis of a recent reformulation of the primal-dual hybrid gradient method [Zhu and Chan 2008, Pock, Cremers, Bischof and Chambolle 2009, Esser, Zhang and Chan 2010, Chambolle and Pock 2011], which allows it to be applied to nonconvex regularizers, as first proposed for truncated quadratic penalization in [Strekalovskiy and Cremers 2014].
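The update structure of the primal-dual hybrid gradient method can be sketched on a convex model problem (our choice of instance and step sizes, shown only to illustrate the scheme the paper extends to semiconvex regularizers): min over x of ‖Ax − b‖₁ + 0.5‖x‖², alternating a proximal dual ascent, a proximal primal descent, and an extrapolation step.

```python
import numpy as np

rng = np.random.default_rng(5)
m, n = 30, 20
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

Lop = np.linalg.norm(A, 2)             # operator norm of A
tau = sigma = 0.9 / Lop                # step sizes with tau * sigma * ||A||^2 < 1

def objective(x):
    return np.abs(A @ x - b).sum() + 0.5 * (x @ x)

x = np.zeros(n); x_bar = x.copy(); y = np.zeros(m)
f0 = objective(x)
for _ in range(500):
    # Dual step: prox of the conjugate of ||. - b||_1 is a clip to the l_inf ball.
    y = np.clip(y + sigma * (A @ x_bar - b), -1.0, 1.0)
    # Primal step: prox of 0.5*||x||^2 is a simple rescaling.
    x_new = (x - tau * (A.T @ y)) / (1.0 + tau)
    x_bar = 2.0 * x_new - x            # extrapolation (over-relaxation) step
    x = x_new

assert objective(x) < f0               # improved on the zero initialization
```

In the convex setting these iterations converge to a saddle point; the paper's contribution is to analyze what survives of this behavior when the regularizer is only semiconvex.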
