Search Results for author: David Rügamer

Found 46 papers, 16 papers with code

Post-Training Network Compression for 3D Medical Image Segmentation: Reducing Computational Efforts via Tucker Decomposition

1 code implementation • 15 Apr 2024 • Tobias Weber, Jakob Dexl, David Rügamer, Michael Ingrisch

The application of Tucker decomposition to the TS model substantially reduced the model parameters and FLOPs across various compression rates, with limited loss in segmentation accuracy.

Computational Efficiency • Image Segmentation +5
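The core idea, factorizing a convolution kernel into a small core tensor plus per-mode factor matrices, can be sketched with a plain truncated higher-order SVD (one common way to compute a Tucker decomposition). The kernel shape and target ranks below are illustrative, not taken from the paper:

```python
import numpy as np

def mode_unfold(t, mode):
    """Mode-n matricization: unfold tensor along `mode` into a matrix."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def tucker_compress(kernel, ranks):
    """Truncated HOSVD: per-mode factors from the leading left singular
    vectors of each unfolding, then project onto them for the core."""
    factors = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(mode_unfold(kernel, mode), full_matrices=False)
        factors.append(u[:, :r])
    core = kernel
    for mode, f in enumerate(factors):
        core = np.moveaxis(np.tensordot(f.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

def reconstruct(core, factors):
    """Multiply the core back with the factor matrices along each mode."""
    out = core
    for mode, f in enumerate(factors):
        out = np.moveaxis(np.tensordot(f, np.moveaxis(out, mode, 0), axes=1), 0, mode)
    return out

rng = np.random.default_rng(0)
kernel = rng.normal(size=(64, 32, 3, 3))          # (out_ch, in_ch, kH, kW)
core, factors = tucker_compress(kernel, (16, 8, 3, 3))
orig = kernel.size
comp = core.size + sum(f.size for f in factors)
print(f"parameters: {orig} -> {comp}")
```

In a network, the factorized kernel would replace the original convolution (typically as a sequence of smaller convolutions), which is where the FLOP reduction comes from.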

Training Survival Models using Scoring Rules

no code implementations • 19 Mar 2024 • Philipp Kopper, David Rügamer, Raphael Sonabend, Bernd Bischl, Andreas Bender

Survival Analysis provides critical insights for partially incomplete time-to-event data in various domains.

Survival Analysis

Interpretable Machine Learning for TabPFN

1 code implementation • 16 Mar 2024 • David Rundel, Julius Kobialka, Constantin von Crailsheim, Matthias Feurer, Thomas Nagler, David Rügamer

The recently developed Prior-Data Fitted Networks (PFNs) have shown very promising results for applications in low-data regimes.

Data Valuation • In-Context Learning +1

Scalable Higher-Order Tensor Product Spline Models

no code implementations • 2 Feb 2024 • David Rügamer

In the current era of vast data and transparent machine learning, it is essential for techniques to operate at a large scale while providing a clear mathematical understanding of the method's internal workings.

Bayesian Semi-structured Subspace Inference

no code implementations • 23 Jan 2024 • Daniel Dold, David Rügamer, Beate Sick, Oliver Dürr

To this end, we extend subspace inference for joint posterior sampling from a full parameter space for structured effects and a subspace for unstructured effects.

regression

Unreading Race: Purging Protected Features from Chest X-ray Embeddings

no code implementations • 2 Nov 2023 • Tobias Weber, Michael Ingrisch, Bernd Bischl, David Rügamer

Materials and Methods: An orthogonalization is used to remove the influence of protected features (e.g., age, sex, race) from chest radiograph embeddings, ensuring feature-independent results.

Baby's CoThought: Leveraging Large Language Models for Enhanced Reasoning in Compact Models

1 code implementation • 3 Aug 2023 • Zheyu Zhang, Han Yang, Bolei Ma, David Rügamer, Ercong Nie

Large Language Models (LLMs) demonstrate remarkable performance on a variety of natural language understanding (NLU) tasks, primarily due to their in-context learning ability.

In-Context Learning • Natural Language Understanding +1

Smoothing the Edges: A General Framework for Smooth Optimization in Sparse Regularization using Hadamard Overparametrization

no code implementations • 7 Jul 2023 • Chris Kolb, Christian L. Müller, Bernd Bischl, David Rügamer

This is particularly useful in non-convex regularization, where finding global solutions is NP-hard and local minima often generalize well.
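The mechanism behind Hadamard overparametrization can be illustrated with the known variational identity min over u·v = beta of (u² + v²)/2 = |beta|: a smooth ridge penalty on the factors induces a non-smooth L1 penalty on their product. A small numerical check, with a grid search standing in for the analytical minimization:

```python
import numpy as np

def induced_penalty(beta, grid=np.linspace(1e-3, 10, 200000)):
    """Minimize (u**2 + v**2) / 2 over all u, v with u * v = beta,
    parametrized as u = t, v = beta / t for t > 0. The analytical
    minimum, attained at t = sqrt(|beta|), equals |beta|."""
    return ((grid**2 + (beta / grid) ** 2) / 2).min()

for beta in (0.5, 1.0, 3.0):
    print(beta, induced_penalty(beta))
```

Because the reparametrized objective is smooth, plain gradient-based optimizers can be used while the effective penalty on the original coefficients remains sparsity-inducing.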

A New PHO-rmula for Improved Performance of Semi-Structured Networks

no code implementations • 1 Jun 2023 • David Rügamer

Recent advances to combine structured regression models and deep neural networks for better interpretability, more expressiveness, and statistically valid uncertainty quantification demonstrate the versatility of semi-structured neural networks (SSNs).

Uncertainty Quantification

Constrained Probabilistic Mask Learning for Task-specific Undersampled MRI Reconstruction

1 code implementation • 25 May 2023 • Tobias Weber, Michael Ingrisch, Bernd Bischl, David Rügamer

Undersampling is a common method in Magnetic Resonance Imaging (MRI) to subsample the number of data points in k-space, reducing acquisition times at the cost of decreased image quality.

MRI Reconstruction
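k-space undersampling can be sketched in a few lines: Fourier-transform an image, zero out unsampled phase-encoding lines according to a binary mask, and invert the transform for a zero-filled reconstruction. The mask pattern below (fully sampled center plus random outer lines) is a common heuristic, not the learned probabilistic mask from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
image = rng.normal(size=(64, 64))           # stand-in for an MR image

# Fully sampled k-space via 2D FFT.
kspace = np.fft.fft2(image)

# Binary column mask: keep the low-frequency center (most image energy)
# plus a random subset of high-frequency phase-encoding lines.
keep = np.zeros(64, dtype=bool)
keep[28:36] = True
keep |= rng.random(64) < 0.18
mask = np.broadcast_to(keep, (64, 64))

# Zero-filled reconstruction from the undersampled k-space.
recon = np.fft.ifft2(np.where(mask, kspace, 0)).real

print("sampled fraction:", keep.mean())
```

Skipping lines shortens acquisition proportionally to the unsampled fraction; the reconstruction task is then to recover image quality from the incomplete k-space.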

Towards Efficient MCMC Sampling in Bayesian Neural Networks by Exploiting Symmetry

no code implementations • 6 Apr 2023 • Jonas Gregor Wiese, Lisa Wimmer, Theodore Papamarkou, Bernd Bischl, Stephan Günnemann, David Rügamer

Bayesian inference in deep neural networks is challenging due to the high-dimensional, strongly multi-modal parameter posterior density landscape.

Bayesian Inference • Uncertainty Quantification

Cascaded Latent Diffusion Models for High-Resolution Chest X-ray Synthesis

2 code implementations • 20 Mar 2023 • Tobias Weber, Michael Ingrisch, Bernd Bischl, David Rügamer

While recent advances in large-scale foundational models show promising results, their application to the medical domain has not yet been explored in detail.

Vocal Bursts Intensity Prediction

Representation Learning for Tablet and Paper Domain Adaptation in Favor of Online Handwriting Recognition

no code implementations • 16 Jan 2023 • Felix Ott, David Rügamer, Lucas Heublein, Bernd Bischl, Christopher Mutschler

The goal of domain adaptation (DA) is to mitigate this domain shift problem by searching for an optimal feature transformation to learn a domain-invariant representation.

Domain Adaptation • Handwriting Recognition +1

Uncertainty-aware predictive modeling for fair data-driven decisions

no code implementations • 4 Nov 2022 • Patrick Kaiser, Christoph Kern, David Rügamer

Both industry and academia have made considerable progress in developing trustworthy and responsible machine learning (ML) systems.

Fairness • regression

What cleaves? Is proteasomal cleavage prediction reaching a ceiling?

1 code implementation • 24 Oct 2022 • Ingo Ziegler, Bolei Ma, Ercong Nie, Bernd Bischl, David Rügamer, Benjamin Schubert, Emilio Dorigatti

While direct identification of proteasomal cleavage in vitro is cumbersome and low throughput, it is possible to implicitly infer cleavage events from the termini of MHC-presented epitopes, which can be detected in large amounts thanks to recent advances in high-throughput MHC ligandomics.

Benchmarking • Denoising

Privacy-Preserving and Lossless Distributed Estimation of High-Dimensional Generalized Additive Mixed Models

1 code implementation • 14 Oct 2022 • Daniel Schalk, Bernd Bischl, David Rügamer

In this paper, we propose an algorithm for a distributed, privacy-preserving, and lossless estimation of generalized additive mixed models (GAMM) using component-wise gradient boosting (CWB).

feature selection • Privacy Preserving

ARMA Cell: A Modular and Effective Approach for Neural Autoregressive Modeling

2 code implementations • 31 Aug 2022 • Philipp Schiele, Christoph Berninger, David Rügamer

In this work, we introduce the ARMA cell, a simpler, modular, and effective approach for time series modeling in neural networks.

Time Series • Time Series Analysis
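As a rough illustration of embedding an ARMA-style recursion in a neural cell, the following hypothetical ARMA(1,1)-like update combines the lagged input (AR part) with the lagged model residual (MA part) inside a nonlinearity. This is a sketch of the general idea, not the paper's actual cell or parametrization:

```python
import numpy as np

def arma_cell(x, phi=0.6, theta=0.3, activation=np.tanh):
    """Hypothetical minimal ARMA(1,1)-style cell: each output combines the
    lagged input (AR term, weight phi) and the lagged one-step residual
    (MA term, weight theta), passed through a nonlinearity as in an RNN."""
    y = np.zeros_like(x)
    resid = 0.0
    for t in range(1, len(x)):
        y[t] = activation(phi * x[t - 1] + theta * resid)
        resid = x[t] - y[t]                  # residual fed into the next step
    return y

rng = np.random.default_rng(3)
x = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * rng.normal(size=200)
y = arma_cell(x)
print("one-step-ahead MSE:", np.mean((x[1:] - y[1:]) ** 2))
```

In the modular view of the paper's title, such a cell would be stacked or combined like any other recurrent unit, with phi and theta learned rather than fixed.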

Additive Higher-Order Factorization Machines

no code implementations • 28 May 2022 • David Rügamer

One of the main limitations is the lack of interactions in these models, which are excluded for the sake of better interpretability, but also due to untenable computational costs.

Interpretable Machine Learning

Domain Adaptation for Time-Series Classification to Mitigate Covariate Shift

1 code implementation • 7 Apr 2022 • Felix Ott, David Rügamer, Lucas Heublein, Bernd Bischl, Christopher Mutschler

To mitigate this domain shift problem, domain adaptation (DA) techniques search for an optimal transformation that converts the (current) input data from a source domain to a target domain to learn a domain-invariant representation that reduces domain discrepancy.

Domain Adaptation • Time Series +2

Auxiliary Cross-Modal Representation Learning with Triplet Loss Functions for Online Handwriting Recognition

no code implementations • 16 Feb 2022 • Felix Ott, David Rügamer, Lucas Heublein, Bernd Bischl, Christopher Mutschler

We perform extensive evaluations on synthetic image and time-series data, on data for offline handwriting recognition (HWR), and on online HWR from sensor-enhanced pens for classifying written words.

Classification • Handwriting Recognition +6

Benchmarking Online Sequence-to-Sequence and Character-based Handwriting Recognition from IMU-Enhanced Pens

no code implementations • 14 Feb 2022 • Felix Ott, David Rügamer, Lucas Heublein, Tim Hamann, Jens Barth, Bernd Bischl, Christopher Mutschler

While many offline HWR datasets exist, little data is available for the development of OnHWR methods on paper, as this requires hardware-integrated pens.

Benchmarking • Handwriting Recognition +1

DeepPAMM: Deep Piecewise Exponential Additive Mixed Models for Complex Hazard Structures in Survival Analysis

no code implementations • 12 Feb 2022 • Philipp Kopper, Simon Wiegrebe, Bernd Bischl, Andreas Bender, David Rügamer

Survival analysis (SA) is an active field of research that is concerned with time-to-event outcomes and is prevalent in many domains, particularly biomedical applications.

Survival Analysis

Survival-oriented embeddings for improving accessibility to complex data structures

no code implementations • 21 Oct 2021 • Tobias Weber, Michael Ingrisch, Matthias Fabritius, Bernd Bischl, David Rügamer

We propose a hazard-regularized variational autoencoder that supports straightforward interpretation of deep neural architectures in the context of survival analysis, a field highly relevant in healthcare.

Decision Making • Survival Analysis

Towards modelling hazard factors in unstructured data spaces using gradient-based latent interpolation

no code implementations • 21 Oct 2021 • Tobias Weber, Michael Ingrisch, Bernd Bischl, David Rügamer

The application of deep learning in survival analysis (SA) allows utilizing unstructured and high-dimensional data types uncommon in traditional survival methods.

Survival Analysis

Accelerated Componentwise Gradient Boosting using Efficient Data Representation and Momentum-based Optimization

no code implementations • 7 Oct 2021 • Daniel Schalk, Bernd Bischl, David Rügamer

Componentwise boosting (CWB), also known as model-based boosting, is a variant of gradient boosting that builds on additive models as base learners to ensure interpretability.

Additive models
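Componentwise boosting itself is compact enough to sketch: in each iteration, fit one simple base learner per feature to the current residuals, keep only the best-fitting one, and add it scaled by a small learning rate. The univariate linear base learners and all constants below are illustrative, not the configuration used in the paper:

```python
import numpy as np

def cwb(X, y, n_iter=200, nu=0.1):
    """Componentwise L2 boosting sketch: per iteration, fit a univariate
    linear base learner on each feature against the current residuals and
    update only the single best one, scaled by learning rate nu. Features
    that are never selected keep a zero coefficient, giving implicit
    feature selection alongside interpretable additive effects."""
    n, p = X.shape
    coef = np.zeros(p)
    pred = np.zeros(n)
    for _ in range(n_iter):
        resid = y - pred
        slopes = X.T @ resid / (X**2).sum(axis=0)        # per-feature LS slope
        sse = ((resid[:, None] - X * slopes) ** 2).sum(axis=0)
        j = sse.argmin()                                  # best base learner
        coef[j] += nu * slopes[j]
        pred += nu * slopes[j] * X[:, j]
    return coef

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 10))
y = 2 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=300)
coef = cwb(X, y)
print(np.round(coef, 2))
```

The accelerated variant described in the paper keeps this selection scheme but speeds up the inner loop via efficient data representations and momentum; the sketch above only shows the baseline algorithm.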

Automatic Componentwise Boosting: An Interpretable AutoML System

no code implementations • 12 Sep 2021 • Stefan Coors, Daniel Schalk, Bernd Bischl, David Rügamer

Despite its restriction to an interpretable model space, our system is competitive in terms of predictive performance on most data sets while being more user-friendly and transparent.

AutoML • Feature Importance +2

Developing Open Source Educational Resources for Machine Learning and Data Science

no code implementations • 28 Jul 2021 • Ludwig Bothmann, Sven Strickroth, Giuseppe Casalicchio, David Rügamer, Marius Lindauer, Fabian Scheipl, Bernd Bischl

It should be openly accessible to everyone, with as few barriers as possible; even more so for key technologies such as Machine Learning (ML) and Data Science (DS).

BIG-bench Machine Learning

deepregression: a Flexible Neural Network Framework for Semi-Structured Deep Distributional Regression

2 code implementations • 6 Apr 2021 • David Rügamer, Chris Kolb, Cornelius Fritz, Florian Pfisterer, Philipp Kopper, Bernd Bischl, Ruolin Shen, Christina Bukas, Lisa Barros de Andrade e Sousa, Dominik Thalmeier, Philipp Baumann, Lucas Kook, Nadja Klein, Christian L. Müller

In this paper we describe the implementation of semi-structured deep distributional regression, a flexible framework to learn conditional distributions based on the combination of additive regression models and deep networks.

regression

Deep Semi-Supervised Learning for Time Series Classification

1 code implementation • 6 Feb 2021 • Jann Goschenhofer, Rasmus Hvingelby, David Rügamer, Janek Thomas, Moritz Wagner, Bernd Bischl

Based on these adaptations, we explore the potential of deep semi-supervised learning in the context of time series classification by evaluating our methods on large public time series classification problems with varying amounts of labelled samples.

Classification • Data Augmentation +4

Combining Graph Neural Networks and Spatio-temporal Disease Models to Predict COVID-19 Cases in Germany

no code implementations • 3 Jan 2021 • Cornelius Fritz, Emilio Dorigatti, David Rügamer

The results corroborate the necessity of including mobility data and showcase the flexibility and interpretability of our approach.

BIG-bench Machine Learning

Semi-Structured Deep Piecewise Exponential Models

no code implementations • 11 Nov 2020 • Philipp Kopper, Sebastian Pölsterl, Christian Wachinger, Bernd Bischl, Andreas Bender, David Rügamer

We propose a versatile framework for survival analysis that combines advanced concepts from statistics with deep learning.

Survival Analysis

Deep Conditional Transformation Models

no code implementations • 15 Oct 2020 • Philipp F. M. Baumann, Torsten Hothorn, David Rügamer

Learning the cumulative distribution function (CDF) of an outcome variable conditional on a set of features remains challenging, especially in high-dimensional settings.

Neural Mixture Distributional Regression

no code implementations • 14 Oct 2020 • David Rügamer, Florian Pfisterer, Bernd Bischl

We present neural mixture distributional regression (NMDR), a holistic framework to estimate complex finite mixtures of distributional regressions defined by flexible additive predictors.

regression

A General Machine Learning Framework for Survival Analysis

1 code implementation • 27 Jun 2020 • Andreas Bender, David Rügamer, Fabian Scheipl, Bernd Bischl

The modeling of time-to-event data, also known as survival analysis, requires specialized methods that can deal with censoring and truncation, time-varying features and effects, and that extend to settings with multiple competing events.

BIG-bench Machine Learning • Data Augmentation +1

Semi-Structured Distributional Regression -- Extending Structured Additive Models by Arbitrary Deep Neural Networks and Data Modalities

2 code implementations • 13 Feb 2020 • David Rügamer, Chris Kolb, Nadja Klein

We propose a general framework to combine structured regression models and deep neural networks into a unifying network architecture.

Additive models • regression

Inference for $L_2$-Boosting

1 code implementation • 4 May 2018 • David Rügamer, Sonja Greven

We propose a statistical inference framework for the component-wise functional gradient descent algorithm (CFGD) under normality assumption for model errors, also known as $L_2$-Boosting.

Variable Selection

Boosting Functional Regression Models with FDboost

1 code implementation • 30 May 2017 • Sarah Brockhaus, David Rügamer, Sonja Greven

In addition to mean regression, quantile regression models as well as generalized additive models for location scale and shape can be fitted with FDboost.

Computation
