no code implementations • 23 Oct 2023 • Roman Hornung, Malte Nalenz, Lennart Schneider, Andreas Bender, Ludwig Bothmann, Bernd Bischl, Thomas Augustin, Anne-Laure Boulesteix
Our findings corroborate the concern that standard resampling methods often yield biased generalization error (GE) estimates in non-standard settings, underscoring the importance of tailored GE estimation.
no code implementations • 17 Jul 2023 • Lennart Purucker, Lennart Schneider, Marie Anastacio, Joeran Beel, Bernd Bischl, Holger Hoos
Automated machine learning (AutoML) systems commonly ensemble models post hoc to improve predictive performance, typically via greedy ensemble selection (GES).
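Greedy ensemble selection, as introduced by Caruana et al., repeatedly adds (with replacement) the base model whose inclusion most improves the ensemble's validation score. A minimal sketch of that loop, assuming squared-error loss and pre-computed validation predictions per base model (the paper's own setup may differ):

```python
import numpy as np

def greedy_ensemble_selection(val_preds, y_val, n_iter=20):
    """Greedy ensemble selection sketch: at each step, add the base model
    (repeats allowed) whose inclusion minimizes validation MSE."""
    ensemble = []                                # indices of selected models
    running_sum = np.zeros_like(y_val, dtype=float)
    for _ in range(n_iter):
        best_idx, best_loss = None, np.inf
        for i, p in enumerate(val_preds):
            # loss of the ensemble average if model i were added
            loss = np.mean(((running_sum + p) / (len(ensemble) + 1) - y_val) ** 2)
            if loss < best_loss:
                best_idx, best_loss = i, loss
        ensemble.append(best_idx)
        running_sum += val_preds[best_idx]
    # ensemble weights are the selection frequencies
    return np.bincount(ensemble, minlength=len(val_preds)) / len(ensemble)
```

Selecting with replacement is what makes GES produce soft weights rather than a hard subset: a model picked in several rounds receives proportionally more weight.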
1 code implementation • 17 Jul 2023 • Lennart Schneider, Bernd Bischl, Janek Thomas
Efficient optimization is achieved by augmenting the hyperparameter search space of the learning algorithm with feature selection, interaction, and monotonicity constraints.
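The idea of treating interpretability controls as ordinary hyperparameters can be illustrated with a hypothetical search-space sketch (the names and ranges below are illustrative, not the paper's actual search space): per-feature selection bits, an interaction-depth cap, and per-feature monotonicity directions sit alongside the usual learner hyperparameters.

```python
import random

def sample_config(features, rng=random):
    """Draw one configuration from a hypothetical augmented search space
    that mixes learner hyperparameters with interpretability constraints."""
    return {
        # ordinary learner hyperparameters (illustrative ranges)
        "learning_rate": 10 ** rng.uniform(-3, 0),
        "num_trees": rng.randint(50, 500),
        # feature selection: one inclusion bit per feature
        "selected": {f: rng.random() < 0.5 for f in features},
        # maximum interaction depth (1 = purely additive model)
        "max_interaction_depth": rng.randint(1, 3),
        # monotonicity constraint per feature: -1 decreasing, 0 none, +1 increasing
        "monotone": {f: rng.choice([-1, 0, 1]) for f in features},
    }
```

Any search strategy that can sample such configurations (random search, Bayesian optimization) can then trade predictive performance against model complexity.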
1 code implementation • 30 Jul 2022 • Lennart Schneider, Lennart Schäpermeier, Raphael Patrick Prager, Bernd Bischl, Heike Trautmann, Pascal Kerschke
We identify a subset of BBOB problems that are close to the HPO problems in ELA feature space and show that optimizer performance is comparable on these two sets of benchmark problems.
1 code implementation • 30 Jul 2022 • Lennart Schneider, Florian Pfisterer, Paul Kent, Juergen Branke, Bernd Bischl, Janek Thomas
Although considerable progress has been made in the field of multi-objective NAS, we argue that there is some discrepancy between the actual optimization problem of practical interest and the optimization problem that multi-objective NAS tries to solve.
no code implementations • 15 Jun 2022 • Florian Karl, Tobias Pielok, Julia Moosbauer, Florian Pfisterer, Stefan Coors, Martin Binder, Lennart Schneider, Janek Thomas, Jakob Richter, Michel Lang, Eduardo C. Garrido-Merchán, Juergen Branke, Bernd Bischl
Hyperparameter optimization constitutes a large part of typical modern machine learning workflows.
1 code implementation • 28 Apr 2022 • Lennart Schneider, Florian Pfisterer, Janek Thomas, Bernd Bischl
The goal of Quality Diversity Optimization is to generate a collection of diverse yet high-performing solutions to a given problem.
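A canonical Quality Diversity algorithm is MAP-Elites, which partitions a behavior descriptor into niches and keeps the best solution found per niche. A minimal sketch, assuming a scalar behavior descriptor in [0, 1) and user-supplied `evaluate`, `behavior`, `sample`, and `mutate` functions (this is a generic illustration, not the paper's method):

```python
import random

def map_elites(evaluate, behavior, sample, mutate, bins=10, iters=1000):
    """MAP-Elites sketch: maintain an archive mapping each behavior niche
    to the highest-fitness solution discovered for that niche."""
    archive = {}  # niche index -> (fitness, solution)
    for _ in range(iters):
        if archive and random.random() < 0.9:
            # exploit: mutate a randomly chosen elite
            _, parent = random.choice(list(archive.values()))
            x = mutate(parent)
        else:
            # explore: draw a fresh solution
            x = sample()
        niche = min(int(behavior(x) * bins), bins - 1)
        f = evaluate(x)
        if niche not in archive or f > archive[niche][0]:
            archive[niche] = (f, x)
    return archive
```

The result is not a single optimum but a grid of elites, one per occupied niche, which is exactly the "diverse yet high-performing collection" that QDO targets.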
1 code implementation • 29 Nov 2021 • Julia Moosbauer, Martin Binder, Lennart Schneider, Florian Pfisterer, Marc Becker, Michel Lang, Lars Kotthoff, Bernd Bischl
Automated hyperparameter optimization (HPO) has gained great popularity and is an important ingredient of most automated machine learning frameworks.
1 code implementation • 8 Sep 2021 • Florian Pfisterer, Lennart Schneider, Julia Moosbauer, Martin Binder, Bernd Bischl
When developing and analyzing new hyperparameter optimization methods, it is vital to empirically evaluate and compare them on well-curated benchmark suites.
no code implementations • ICML Workshop AutoML 2021 • Lennart Schneider, Florian Pfisterer, Martin Binder, Bernd Bischl
Neural architecture search (NAS) promises to make deep learning accessible to non-experts by automating architecture engineering of deep neural networks.
1 code implementation • 10 Oct 2018 • Lennart Schneider, R. Philip Chalmers, Rudolf Debelak, Edgar C. Merkle
Vuong's approach to model selection is useful because it allows for formal statistical tests of both nested and non-nested models.
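For the non-nested case, Vuong's statistic is the scaled sum of pointwise log-likelihood differences between the two models, which is asymptotically standard normal under the null that the models are equally close to the truth. A minimal sketch from pointwise log-likelihoods (without the variance corrections a full implementation would include):

```python
import math
import numpy as np

def vuong_test(ll1, ll2):
    """Basic Vuong closeness test for two non-nested models, given one
    log-likelihood value per observation; positive z favors model 1."""
    d = np.asarray(ll1, dtype=float) - np.asarray(ll2, dtype=float)
    n = d.size
    omega = d.std()                        # sd of pointwise differences
    z = d.sum() / (math.sqrt(n) * omega)   # asymptotically N(0, 1) under H0
    p = math.erfc(abs(z) / math.sqrt(2))   # two-sided normal p-value
    return z, p
```

By construction the statistic is antisymmetric: swapping the two models flips the sign of z, so the test does not privilege either model a priori.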