Search Results for author: Daniel Schalk

Found 5 papers, 2 papers with code

Multimodal Deep Learning

1 code implementation • 12 Jan 2023 • Cem Akkus, Luyang Chu, Vladana Djakovic, Steffen Jauch-Walser, Philipp Koch, Giacomo Loss, Christopher Marquardt, Marco Moldovan, Nadja Sauter, Maximilian Schneider, Rickmer Schulte, Karol Urbanczyk, Jann Goschenhofer, Christian Heumann, Rasmus Hvingelby, Daniel Schalk, Matthias Aßenmacher

This book is the result of a seminar in which we reviewed multimodal approaches and attempted to create a solid overview of the field, starting with the current state-of-the-art approaches in the two individual subfields of Deep Learning.

Multimodal Deep Learning • Representation Learning

Privacy-Preserving and Lossless Distributed Estimation of High-Dimensional Generalized Additive Mixed Models

1 code implementation • 14 Oct 2022 • Daniel Schalk, Bernd Bischl, David Rügamer

In this paper, we propose an algorithm for a distributed, privacy-preserving, and lossless estimation of generalized additive mixed models (GAMM) using component-wise gradient boosting (CWB).

Feature Selection • Privacy Preserving

Accelerated Componentwise Gradient Boosting using Efficient Data Representation and Momentum-based Optimization

no code implementations • 7 Oct 2021 • Daniel Schalk, Bernd Bischl, David Rügamer

Componentwise boosting (CWB), also known as model-based boosting, is a variant of gradient boosting that uses additive models as base learners to ensure interpretability.

Additive models
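To make the CWB idea above concrete, here is a minimal illustrative sketch (not the authors' implementation, which uses efficient data representations and momentum-based optimization): in each iteration a simple least-squares base learner is fitted to every feature separately, and only the single best-fitting component is updated. The function name and toy data are hypothetical.

```python
import numpy as np

def componentwise_boosting(X, y, n_iter=100, learning_rate=0.1):
    """Minimal componentwise L2 boosting sketch.

    Each iteration fits a univariate linear base learner to every
    feature on the current residuals (the negative gradient of the
    squared loss) and updates only the coefficient of the feature
    whose learner reduces the loss most. The result is an additive
    (here: linear) model with implicit feature selection.
    """
    n, p = X.shape
    coef = np.zeros(p)
    intercept = y.mean()          # loss-optimal constant model as start
    resid = y - intercept         # residuals = negative gradient
    for _ in range(n_iter):
        best_j, best_beta, best_sse = 0, 0.0, np.inf
        for j in range(p):
            xj = X[:, j]
            beta = xj @ resid / (xj @ xj)   # least-squares fit to residuals
            sse = np.sum((resid - beta * xj) ** 2)
            if sse < best_sse:
                best_j, best_beta, best_sse = j, beta, sse
        # update only the selected component, damped by the learning rate
        coef[best_j] += learning_rate * best_beta
        resid -= learning_rate * best_beta * X[:, best_j]
    return intercept, coef

# Toy data: only the first feature is informative, so CWB should
# concentrate its updates on coefficient 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=200)
intercept, coef = componentwise_boosting(X, y, n_iter=200)
```

Because only one coefficient changes per iteration, stopping early yields a sparse, directly interpretable model, which is the property the paper's accelerated variant preserves.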

Automatic Componentwise Boosting: An Interpretable AutoML System

no code implementations • 12 Sep 2021 • Stefan Coors, Daniel Schalk, Bernd Bischl, David Rügamer

Despite being restricted to an interpretable model space, our system is competitive in predictive performance on most data sets while being more user-friendly and transparent.

AutoML • Feature Importance • +2
