Search Results for author: Martin Binder

Found 9 papers, 3 papers with code

counterfactuals: An R Package for Counterfactual Explanation Methods

no code implementations · 13 Apr 2023 · Susanne Dandl, Andreas Hofheinz, Martin Binder, Bernd Bischl, Giuseppe Casalicchio

Counterfactual explanation methods provide information on how feature values of individual observations must be changed to obtain a desired prediction.

counterfactual · Counterfactual Explanation
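The idea in the abstract above can be illustrated with a minimal sketch (a hypothetical toy model, not the API of the counterfactuals R package): given a prediction we dislike, search for the smallest change to a feature that yields the desired prediction.

```python
# Hypothetical illustration of a counterfactual explanation.
# The model, features, and threshold are invented for this sketch.

def predict(income, debt):
    """Toy credit model: approve when income minus debt clears a threshold."""
    return "approve" if income - debt >= 50 else "reject"

def counterfactual_income(income, debt, step=1, max_steps=1000):
    """Increase income in small steps until the desired prediction is reached."""
    for i in range(max_steps):
        candidate = income + i * step
        if predict(candidate, debt) == "approve":
            return candidate
    return None

# An applicant rejected at income=60, debt=30 would be approved at income=80.
print(counterfactual_income(60, 30))  # -> 80
```

The returned value answers exactly the question the abstract poses: how a feature value of an individual observation must change to obtain the desired prediction.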

Automated Benchmark-Driven Design and Explanation of Hyperparameter Optimizers

1 code implementation · 29 Nov 2021 · Julia Moosbauer, Martin Binder, Lennart Schneider, Florian Pfisterer, Marc Becker, Michel Lang, Lars Kotthoff, Bernd Bischl

Automated hyperparameter optimization (HPO) has gained great popularity and is an important ingredient of most automated machine learning frameworks.

Bayesian Optimization · Hyperparameter Optimization
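For readers unfamiliar with HPO, the basic loop the paper studies can be sketched as a simple random search (a generic illustration with an invented objective, not the optimizer design the paper proposes):

```python
import random

def objective(lr, depth):
    """Stand-in validation loss with a known optimum at lr=0.1, depth=5."""
    return (lr - 0.1) ** 2 + (depth - 5) ** 2

def random_search(n_trials=200, seed=0):
    """Minimal HPO loop: sample configurations, keep the best one seen."""
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(n_trials):
        cfg = (rng.uniform(0.001, 1.0), rng.randint(1, 10))
        loss = objective(*cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

best_cfg, best_loss = random_search()
```

More sophisticated optimizers such as Bayesian optimization replace the random sampling step with a model-guided proposal, which is the design space the paper benchmarks.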

Self-GenomeNet: Self-supervised Learning with Reverse-Complement Context Prediction for Nucleotide-level Genomics Data

no code implementations · 29 Sep 2021 · Hüseyin Anil Gündüz, Martin Binder, Xiao-Yin To, René Mreches, Philipp C. Münch, Alice C. McHardy, Bernd Bischl, Mina Rezaei

We introduce Self-GenomeNet, a novel contrastive self-supervised learning method for nucleotide-level genomic data, which substantially improves the quality of the learned representations and performance compared to the current state-of-the-art deep learning frameworks.

Self-Supervised Learning
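The reverse complement that gives the method its training signal is a standard genomics operation: reverse the sequence and swap A with T and C with G. A minimal sketch (the pretext task itself, operating on learned representations, is not reproduced here):

```python
# Reverse complement of a DNA sequence: reverse it and swap A<->T, C<->G.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    """Return the reverse complement of a nucleotide sequence."""
    return seq.translate(COMPLEMENT)[::-1]

print(reverse_complement("GATTACA"))  # -> TGTAATC
```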

YAHPO Gym -- An Efficient Multi-Objective Multi-Fidelity Benchmark for Hyperparameter Optimization

1 code implementation · 8 Sep 2021 · Florian Pfisterer, Lennart Schneider, Julia Moosbauer, Martin Binder, Bernd Bischl

When developing and analyzing new hyperparameter optimization methods, it is vital to empirically evaluate and compare them on well-curated benchmark suites.

Hyperparameter Optimization

Mutation is all you need

no code implementations ICML Workshop AutoML 2021 Lennart Schneider, Florian Pfisterer, Martin Binder, Bernd Bischl

Neural architecture search (NAS) promises to make deep learning accessible to non-experts by automating architecture engineering of deep neural networks.

Bayesian Optimization · Neural Architecture Search
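The mutation operator central to this line of work can be sketched on a simple list-of-layer-widths encoding (an invented encoding for illustration, not the search space used in the paper):

```python
import random

def mutate(architecture, widths=(16, 32, 64, 128), rng=None):
    """Randomly change the width of one layer in a list-of-widths encoding."""
    rng = rng or random.Random()
    child = list(architecture)
    i = rng.randrange(len(child))
    # Pick a different width for the chosen layer, guaranteeing a real change.
    child[i] = rng.choice([w for w in widths if w != child[i]])
    return child

parent = [32, 64, 64]
child = mutate(parent, rng=random.Random(0))
```

A NAS loop built on this operator repeatedly mutates promising architectures and keeps the children that train to better validation performance.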

Multi-Objective Counterfactual Explanations

1 code implementation · 23 Apr 2020 · Susanne Dandl, Christoph Molnar, Martin Binder, Bernd Bischl

We show the usefulness of MOC in concrete cases and compare our approach with state-of-the-art methods for counterfactual explanations.

counterfactual

Multi-Objective Hyperparameter Tuning and Feature Selection using Filter Ensembles

no code implementations · 30 Dec 2019 · Martin Binder, Julia Moosbauer, Janek Thomas, Bernd Bischl

While model-based optimization needs fewer objective evaluations to achieve good performance, it incurs computational overhead compared to the NSGA-II, so the preferred choice depends on the cost of evaluating a model on given data.

feature selection · Hyperparameter Optimization
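Multi-objective methods such as the NSGA-II mentioned above compare candidates by Pareto dominance rather than a single score. A minimal sketch of the non-dominated filter at the heart of such comparisons (the candidate tuples are invented for illustration):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Candidates as (error, n_features): both objectives should be minimized.
candidates = [(0.10, 20), (0.12, 10), (0.15, 10), (0.08, 30)]
print(pareto_front(candidates))  # -> [(0.1, 20), (0.12, 10), (0.08, 30)]
```

The dominated candidate (0.15, 10) is dropped because (0.12, 10) is at least as good in both objectives; the survivors form the trade-off curve between predictive error and number of selected features.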
