Search Results for author: Julia Moosbauer

Found 10 papers, 4 papers with code

Position Paper: Bridging the Gap Between Machine Learning and Sensitivity Analysis

no code implementations · 20 Dec 2023 · Christian A. Scholbeck, Julia Moosbauer, Giuseppe Casalicchio, Hoshin Gupta, Bernd Bischl, Christian Heumann

We argue that interpretations of machine learning (ML) models, or of the model-building process, can be seen as a form of sensitivity analysis (SA), a general methodology used to explain complex systems in many fields such as environmental modeling, engineering, or economics.

Position

RAISE -- Radiology AI Safety, an End-to-end lifecycle approach

no code implementations · 24 Nov 2023 · M. Jorge Cardoso, Julia Moosbauer, Tessa S. Cook, B. Selnur Erdal, Brad Genereaux, Vikash Gupta, Bennett A. Landman, Tiarna Lee, Parashkev Nachev, Elanchezhian Somasundaram, Ronald M. Summers, Khaled Younis, Sebastien Ourselin, Franz MJ Pfister

The integration of AI into radiology introduces opportunities for improved clinical care provision and efficiency, but, as with any new technology, it demands a meticulous approach to mitigating potential risks.

Fairness, Scheduling

Automated Benchmark-Driven Design and Explanation of Hyperparameter Optimizers

1 code implementation · 29 Nov 2021 · Julia Moosbauer, Martin Binder, Lennart Schneider, Florian Pfisterer, Marc Becker, Michel Lang, Lars Kotthoff, Bernd Bischl

Automated hyperparameter optimization (HPO) has gained great popularity and is an important ingredient of most automated machine learning frameworks.

Bayesian Optimization, Hyperparameter Optimization
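The core loop shared by the HPO methods this paper studies can be illustrated with a minimal sketch. The code below is not taken from the paper's implementation; it is a generic random-search baseline over a hypothetical two-parameter search space, included only to make the "sample configuration, evaluate objective, keep the best" pattern concrete.

```python
import random

def random_search(objective, space, n_trials=50, seed=0):
    """Minimal random-search HPO: sample configurations uniformly
    from box bounds and keep the best-scoring one (minimization)."""
    rng = random.Random(seed)
    best_cfg, best_val = None, float("inf")
    for _ in range(n_trials):
        cfg = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        val = objective(cfg)
        if val < best_val:
            best_cfg, best_val = cfg, val
    return best_cfg, best_val

# Toy objective with a known optimum at lr=0.1, depth=5 (illustrative only).
space = {"lr": (0.001, 1.0), "depth": (1.0, 10.0)}
best_cfg, best_val = random_search(
    lambda c: (c["lr"] - 0.1) ** 2 + (c["depth"] - 5.0) ** 2, space
)
```

Bayesian optimization, which the paper's tags reference, replaces the uniform sampling step with a surrogate model that proposes promising configurations; the outer evaluate-and-keep-best loop is the same.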

YAHPO Gym -- An Efficient Multi-Objective Multi-Fidelity Benchmark for Hyperparameter Optimization

1 code implementation · 8 Sep 2021 · Florian Pfisterer, Lennart Schneider, Julia Moosbauer, Martin Binder, Bernd Bischl

When developing and analyzing new hyperparameter optimization methods, it is vital to empirically evaluate and compare them on well-curated benchmark suites.

Hyperparameter Optimization
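The benchmark-suite workflow the abstract describes, running several optimizers across a fixed set of problems and recording results, can be sketched in a few lines. This is not the YAHPO Gym API; the problem set, optimizers, and function names below are hypothetical stand-ins chosen to show the cross-product evaluation pattern.

```python
import random

def random_search(fn, lo, hi, n, rng):
    """Best value from n uniform samples of a 1-D objective."""
    return min(fn(rng.uniform(lo, hi)) for _ in range(n))

def grid_search(fn, lo, hi, n, rng):
    """Best value from n evenly spaced evaluations."""
    return min(fn(lo + (hi - lo) * i / (n - 1)) for i in range(n))

def run_benchmark(optimizers, problems, n_trials=30, seed=0):
    """Evaluate every optimizer on every problem; return best values found."""
    results = {}
    for opt_name, opt in optimizers.items():
        for prob_name, (fn, lo, hi) in problems.items():
            results[(opt_name, prob_name)] = opt(fn, lo, hi, n_trials,
                                                 random.Random(seed))
    return results

# Toy 1-D problems standing in for real benchmark instances.
problems = {
    "quadratic": (lambda x: (x - 2.0) ** 2, 0.0, 5.0),
    "abs":       (lambda x: abs(x + 1.0), -3.0, 3.0),
}
results = run_benchmark({"random": random_search, "grid": grid_search}, problems)
```

Real benchmark suites replace the toy objectives with curated (often surrogate-based, multi-fidelity) problem instances, so the comparison is cheap to run yet representative.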

Multi-Objective Hyperparameter Tuning and Feature Selection using Filter Ensembles

no code implementations · 30 Dec 2019 · Martin Binder, Julia Moosbauer, Janek Thomas, Bernd Bischl

While model-based optimization needs fewer objective evaluations to reach good performance, it incurs greater computational overhead than NSGA-II, so the preferred choice depends on the cost of evaluating a model on the given data.

Feature Selection, Hyperparameter Optimization
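Multi-objective tuning of the kind this paper performs (e.g. trading predictive error against the number of selected features) produces a set of non-dominated configurations rather than a single winner. The sketch below, not drawn from the paper's code, shows the standard Pareto-front filter both NSGA-II and model-based approaches ultimately report.

```python
def pareto_front(points):
    """Return the non-dominated points, minimizing every objective.

    A point p is dominated if some other point q is no worse in all
    objectives and strictly better in at least one (distinct tuples
    with q <= p componentwise satisfy this)."""
    front = []
    for p in points:
        dominated = any(
            q != p and all(q[i] <= p[i] for i in range(len(p)))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (error, n_features) outcomes of candidate configurations.
candidates = [(0.10, 30), (0.12, 12), (0.15, 20), (0.25, 4)]
front = pareto_front(candidates)  # (0.15, 20) is dominated by (0.12, 12)
```

The optimizer's job is then to approximate this front with as few objective evaluations as possible, which is where the model-based-vs-NSGA-II trade-off above comes in.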
