no code implementations • LREC 2022 • Jakob Lesage, Hannah J. Haynie, Hedvig Skirgård, Tobias Weber, Alena Witzlack-Makarevich
We then aggregate these comments and the coded values to derive a level of description for 17 grammatical domains that Grambank covers (negation, adnominal modification, participant marking, tense, aspect, etc.).
no code implementations • 4 Feb 2025 • Chris Kolb, Tobias Weber, Bernd Bischl, David Rügamer
Sparse regularization techniques are well-established in machine learning, yet their application in neural networks remains challenging due to the non-differentiability of penalties like the $L_1$ norm, which is incompatible with stochastic gradient descent.
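As a hedged illustration of the obstacle described above (not the method proposed in this paper), the sketch below shows the classic proximal-gradient workaround: an SGD step on the smooth loss followed by soft-thresholding, which produces the exact zeros that plain SGD on an $L_1$ penalty cannot. All variable names and sizes are placeholders.

```python
# Illustrative sketch only: proximal (ISTA-style) update as a standard
# workaround for the non-differentiable L1 penalty under SGD.
import torch

def prox_l1_(param: torch.Tensor, threshold: float) -> None:
    """In-place soft-thresholding, the proximal operator of the L1 norm."""
    with torch.no_grad():
        param.copy_(param.sign() * torch.clamp(param.abs() - threshold, min=0.0))

# toy linear model; X, y, lam, lr are placeholders for illustration
X, y = torch.randn(128, 20), torch.randn(128)
w = torch.zeros(20, requires_grad=True)
lam, lr = 0.01, 0.1

for _ in range(200):
    loss = ((X @ w - y) ** 2).mean()   # gradient step on the smooth data-fit term only
    loss.backward()
    with torch.no_grad():
        w -= lr * w.grad
        w.grad.zero_()
    prox_l1_(w, lr * lam)              # exact zeros appear here, unlike plain SGD on |w|
```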
no code implementations • 7 Jun 2024 • Emilia Magnani, Marvin Pförtner, Tobias Weber, Philipp Hennig
We introduce LUNO, a novel framework for approximate Bayesian uncertainty quantification in trained neural operators.
1 code implementation • 3 May 2024 • David Rügamer, Chris Kolb, Tobias Weber, Lucas Kook, Thomas Nagler
The complexity of black-box algorithms can lead to various challenges, including the introduction of biases.
1 code implementation • 15 Apr 2024 • Tobias Weber, Jakob Dexl, David Rügamer, Michael Ingrisch
The application of Tucker decomposition to the TotalSegmentator (TS) model substantially reduced the model parameters and FLOPs across various compression rates, with limited loss in segmentation accuracy.
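A minimal sketch of the underlying idea (Tucker-compressing a convolution kernel), assuming the tensorly library; the layer shape and ranks below are illustrative and not taken from the paper.

```python
# Hedged sketch: Tucker decomposition of a conv kernel and the resulting
# parameter-count reduction. Ranks are arbitrary illustrative choices.
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

kernel = np.random.randn(64, 32, 3, 3)              # (out_ch, in_ch, kH, kW)
core, factors = tucker(tl.tensor(kernel), rank=[16, 8, 3, 3])

full_params = kernel.size
tucker_params = core.size + sum(f.size for f in factors)
print(f"compression ratio: {full_params / tucker_params:.1f}x")
```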
1 code implementation • 2 Nov 2023 • Tobias Weber, Michael Ingrisch, Bernd Bischl, David Rügamer
Methods: An orthogonalization is utilized to remove the influence of protected features (e.g., age, sex, race) in CXR embeddings, ensuring feature-independent results.
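A minimal sketch of one standard linear orthogonalization (residualizing embeddings against protected attributes); shapes and variable names are illustrative assumptions, not the paper's exact procedure.

```python
# Illustrative residualization: regress each embedding dimension on the
# protected attributes (plus intercept) and keep only the residuals.
import numpy as np

def orthogonalize(Z: np.ndarray, A: np.ndarray) -> np.ndarray:
    """Remove the part of embeddings Z (n x d) explained linearly by
    protected attributes A (n x p)."""
    A1 = np.column_stack([np.ones(len(A)), A])        # design matrix with intercept
    beta, *_ = np.linalg.lstsq(A1, Z, rcond=None)     # least-squares fit
    return Z - A1 @ beta                              # residual (orthogonalized) embeddings

Z = np.random.randn(500, 128)          # stand-in for CXR embeddings
A = np.random.randn(500, 3)            # e.g. encoded age, sex, race
Z_orth = orthogonalize(Z, A)
# residuals are numerically uncorrelated with the protected attributes
print(np.abs(A.T @ Z_orth).max())
```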
1 code implementation • 27 Oct 2023 • Fiete Lüer, Tobias Weber, Maxim Dolgich, Christian Böhm
Anomaly detection in imbalanced datasets is a frequent and crucial problem, especially in the medical domain where retrieving and labeling irregularities is often expensive.
1 code implementation • 25 May 2023 • Tobias Weber, Michael Ingrisch, Bernd Bischl, David Rügamer
Undersampling is a common method in Magnetic Resonance Imaging (MRI) to reduce the number of sampled data points in k-space, shortening acquisition times at the cost of decreased image quality.
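A toy illustration of retrospective k-space undersampling with a zero-filled reconstruction; the acceleration factor and masking pattern below are arbitrary choices, not those used in the paper.

```python
# Hedged sketch: transform an image to k-space, drop most phase-encoding
# lines while keeping the low-frequency centre, and reconstruct by zero-filled
# inverse FFT (the aliased result motivates learned reconstruction).
import numpy as np

image = np.random.rand(256, 256)                     # stand-in for an MR slice
kspace = np.fft.fftshift(np.fft.fft2(image))

mask = np.zeros(256, dtype=bool)
mask[::4] = True                                     # keep every 4th line (4x acceleration)
mask[118:138] = True                                 # fully sample the low-frequency centre
kspace_under = kspace * mask[:, None]

recon = np.abs(np.fft.ifft2(np.fft.ifftshift(kspace_under)))
print("sampled fraction:", mask.mean())
```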
2 code implementations • 28 Mar 2023 • Ludwig Bothmann, Lisa Wimmer, Omid Charrakh, Tobias Weber, Hendrik Edelhoff, Wibke Peters, Hien Nguyen, Caryl Benjamin, Annette Menzel
(2) We provide an active learning (AL) system that enables highly label-efficient training of deep learning models, minimizing the number of human-labeled training images required; see the sketch below.
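A generic uncertainty-sampling loop that conveys the label-efficiency idea; the classifier, synthetic data, and query budget are placeholder assumptions, and this is not the system released with the paper.

```python
# Illustrative active-learning loop: repeatedly train, query the most
# uncertain pool samples for (simulated) human labeling, and retrain.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)        # synthetic labels

labeled = list(rng.choice(len(X), size=20, replace=False))
pool = [i for i in range(len(X)) if i not in labeled]

for _ in range(5):                                   # five labeling rounds
    clf = LogisticRegression().fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])[:, 1]
    uncertainty = -np.abs(proba - 0.5)               # closest to 0.5 = most uncertain
    query = [pool[i] for i in np.argsort(uncertainty)[-50:]]
    labeled += query                                 # "ask the human" for these labels
    pool = [i for i in pool if i not in set(query)]

print("labels used:", len(labeled))
```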
2 code implementations • 20 Mar 2023 • Tobias Weber, Michael Ingrisch, Bernd Bischl, David Rügamer
While recent advances in large-scale foundational models show promising results, their application to the medical domain has not yet been explored in detail.
no code implementations • 30 Dec 2022 • Katharina Jeblick, Balthasar Schachtner, Jakob Dexl, Andreas Mittermeier, Anna Theresa Stüber, Johanna Topalis, Tobias Weber, Philipp Wesp, Bastian Sabel, Jens Ricke, Michael Ingrisch
In a questionnaire, we asked 15 radiologists to assess the quality of radiology reports simplified by ChatGPT.
no code implementations • 21 Oct 2021 • Tobias Weber, Michael Ingrisch, Bernd Bischl, David Rügamer
The application of deep learning in survival analysis (SA) enables the use of unstructured and high-dimensional data types that are uncommon in traditional survival methods.
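As an illustrative baseline of how a neural network typically enters survival analysis (a DeepSurv-style negative Cox partial log-likelihood on network outputs), not the specific architecture of this paper; all shapes and data are placeholders.

```python
# Hedged sketch: train a small network to output a log-hazard score and
# optimize the negative Cox partial log-likelihood over a mini-batch.
import torch
import torch.nn as nn

def neg_cox_partial_log_likelihood(risk: torch.Tensor,
                                   time: torch.Tensor,
                                   event: torch.Tensor) -> torch.Tensor:
    """risk: predicted log-hazard per sample; event: 1 = event, 0 = censored."""
    order = torch.argsort(time, descending=True)     # sort so each risk set is a prefix
    risk, event = risk[order], event[order]
    log_cumsum = torch.logcumsumexp(risk, dim=0)     # log of risk-set sums
    return -((risk - log_cumsum) * event).sum() / event.sum().clamp(min=1)

net = nn.Sequential(nn.Linear(30, 32), nn.ReLU(), nn.Linear(32, 1))
x = torch.randn(64, 30)                              # high-dimensional features (illustrative)
time = torch.rand(64)                                # follow-up times
event = (torch.rand(64) > 0.3).float()               # event indicators
loss = neg_cox_partial_log_likelihood(net(x).squeeze(-1), time, event)
loss.backward()
```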
no code implementations • 21 Oct 2021 • Tobias Weber, Michael Ingrisch, Matthias Fabritius, Bernd Bischl, David Rügamer
We propose a hazard-regularized variational autoencoder that supports straightforward interpretation of deep neural architectures in the context of survival analysis, a field highly relevant in healthcare.
no code implementations • 16 Oct 2019 • Tobias Weber, Dieter Kranzlmüller, Michael Fromm, Nelson Tavares de Sousa
Both applications perform at scale with the proposed models, which are available for re-use.