no code implementations • 5 Jun 2024 • Marius Lindauer, Florian Karl, Anne Klier, Julia Moosbauer, Alexander Tornede, Andreas Mueller, Frank Hutter, Matthias Feurer, Bernd Bischl
Automated machine learning (AutoML) was formed around the fundamental objectives of automatically and efficiently configuring machine learning (ML) workflows, aiding research on new ML algorithms, and contributing to the democratization of ML by making it accessible to a broader audience.
no code implementations • 20 Dec 2023 • Christian A. Scholbeck, Julia Moosbauer, Giuseppe Casalicchio, Hoshin Gupta, Bernd Bischl, Christian Heumann
We argue that interpretations of machine learning (ML) models or the model-building process can be seen as a form of sensitivity analysis (SA), a general methodology used to explain complex systems in many fields such as environmental modeling, engineering, and economics.
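A minimal sketch of this view, using an illustrative scikit-learn model and synthetic data (both assumptions, not the paper's setup): varying one input of a fitted model while holding the others fixed is a classic one-at-a-time sensitivity analysis, and the resulting prediction curve is precisely an ICE-style interpretation of the model.

```python
# Minimal sketch: one-at-a-time sensitivity analysis of a fitted ML model.
# The model, dataset, and feature choice are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=500, n_features=5, noise=0.1, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)

# Vary feature 0 over a grid while keeping the other features at one
# observation's values; the resulting prediction curve is an ICE curve,
# i.e. a one-at-a-time sensitivity analysis of the model output.
x_ref = X[0].copy()
grid = np.linspace(X[:, 0].min(), X[:, 0].max(), 25)
curve = []
for v in grid:
    x = x_ref.copy()
    x[0] = v
    curve.append(model.predict(x.reshape(1, -1))[0])

sensitivity = np.ptp(curve)  # range of predictions as a crude sensitivity measure
print(f"Prediction range when varying feature 0: {sensitivity:.2f}")
```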
no code implementations • 24 Nov 2023 • M. Jorge Cardoso, Julia Moosbauer, Tessa S. Cook, B. Selnur Erdal, Brad Genereaux, Vikash Gupta, Bennett A. Landman, Tiarna Lee, Parashkev Nachev, Elanchezhian Somasundaram, Ronald M. Summers, Khaled Younis, Sebastien Ourselin, Franz MJ Pfister
The integration of AI into radiology introduces opportunities for improved clinical care provision and efficiency, but, as with any new technology, it demands a meticulous approach to mitigating potential risks.
no code implementations • 15 Jun 2022 • Florian Karl, Tobias Pielok, Julia Moosbauer, Florian Pfisterer, Stefan Coors, Martin Binder, Lennart Schneider, Janek Thomas, Jakob Richter, Michel Lang, Eduardo C. Garrido-Merchán, Juergen Branke, Bernd Bischl
Hyperparameter optimization constitutes a large part of typical modern machine learning workflows.
1 code implementation • 11 Jun 2022 • Julia Moosbauer, Giuseppe Casalicchio, Marius Lindauer, Bernd Bischl
Despite all the benefits of automated hyperparameter optimization (HPO), most modern HPO algorithms are black boxes themselves.
1 code implementation • 29 Nov 2021 • Julia Moosbauer, Martin Binder, Lennart Schneider, Florian Pfisterer, Marc Becker, Michel Lang, Lars Kotthoff, Bernd Bischl
Automated hyperparameter optimization (HPO) has gained great popularity and is an important ingredient of most automated machine learning frameworks.
1 code implementation • NeurIPS 2021 • Julia Moosbauer, Julia Herbinger, Giuseppe Casalicchio, Marius Lindauer, Bernd Bischl
Automated hyperparameter optimization (HPO) can help practitioners obtain peak performance in machine learning models.
1 code implementation • 8 Sep 2021 • Florian Pfisterer, Lennart Schneider, Julia Moosbauer, Martin Binder, Bernd Bischl
When developing and analyzing new hyperparameter optimization methods, it is vital to empirically evaluate and compare them on well-curated benchmark suites.
no code implementations • ICML Workshop AutoML 2021 • Julia Moosbauer, Julia Herbinger, Giuseppe Casalicchio, Marius Lindauer, Bernd Bischl
Automated hyperparameter optimization (HPO) can help practitioners obtain peak performance in machine learning models.
no code implementations • 18 Mar 2020 • Abhijeet Parida, Aadhithya Sankar, Rami Eisawy, Tom Finck, Benedikt Wiestler, Franz Pfister, Julia Moosbauer
High-quality labeled data is essential to successfully train supervised machine learning models.
no code implementations • 30 Dec 2019 • Martin Binder, Julia Moosbauer, Janek Thomas, Bernd Bischl
While model-based optimization needs fewer objective evaluations to achieve good performance, it incurs computational overhead compared to NSGA-II, so the preferred choice depends on the cost of evaluating a model on the given data.
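A toy sketch of where that overhead comes from, under illustrative assumptions (a cheap 1D objective, a Gaussian-process surrogate from scikit-learn, and a lower-confidence-bound acquisition rule, none of which are the paper's setup): each model-based iteration refits a surrogate and scores a candidate pool before spending a single expensive evaluation, whereas NSGA-II proposes candidates by cheap crossover and mutation without any surrogate.

```python
# Toy sketch of a model-based optimization loop, highlighting its
# per-iteration overhead. Objective, surrogate, and acquisition rule
# are illustrative assumptions, not the paper's setup.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_objective(x):
    # Stand-in for a costly evaluation (e.g. a cross-validated model fit).
    return np.sin(3 * x) + 0.5 * x ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(5, 1))                      # initial design
y = np.array([expensive_objective(x[0]) for x in X])

for _ in range(10):
    # Overhead step 1: refit the surrogate on all evaluations so far.
    gp = GaussianProcessRegressor().fit(X, y)
    # Overhead step 2: score a candidate pool with a lower-confidence-bound
    # acquisition; NSGA-II would instead propose points by cheap
    # crossover/mutation with no surrogate at all.
    cand = rng.uniform(-2, 2, size=(200, 1))
    mu, sd = gp.predict(cand, return_std=True)
    x_next = cand[np.argmin(mu - sd)]
    # The single truly expensive call per iteration:
    y_next = expensive_objective(x_next[0])
    X = np.vstack([X, x_next])
    y = np.append(y, y_next)

print("best value found:", y.min())
```

In this toy setting the surrogate work dominates because the objective itself is cheap; when each evaluation is a costly model fit on real data, saving evaluations is usually worth the overhead, which is the trade-off the sentence above describes.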