1 code implementation • 27 Mar 2024 • Andreas Müller, Erwin Quiring
We empirically examine our findings in a comprehensive evaluation with multiple image classification models and show that our attack achieves the same sparsity effect as prior sponge-example methods, but at a fraction of the computational effort.
no code implementations • 14 Dec 2023 • Andreas Müller, Carlo Curino, Raghu Ramakrishnan
In contrast to existing hypernetworks that were either task-specific or trained for relatively constrained multi-task settings, MotherNet is trained to generate networks that perform multiclass classification on arbitrary tabular datasets without any dataset-specific gradient descent.
1 code implementation • 23 Oct 2023 • Erwin Quiring, Andreas Müller, Konrad Rieck
Unfortunately, this preprocessing step is vulnerable to so-called image-scaling attacks where an attacker makes unnoticeable changes to an image so that it becomes a new image after scaling.
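The mechanism behind such attacks can be illustrated with a minimal sketch (assumptions: nearest-neighbor downscaling and a toy grayscale image; this is not the paper's algorithm, only the general idea). Because nearest-neighbor scaling samples only a sparse grid of source pixels, an attacker who modifies just those pixels controls the scaled output while leaving the rest of the source image untouched:

```python
import numpy as np

def nearest_neighbor_scale(img, out_size):
    """Downscale a square grayscale image by sampling a sparse pixel grid."""
    h, w = img.shape
    ys = np.arange(out_size) * h // out_size
    xs = np.arange(out_size) * w // out_size
    return img[np.ix_(ys, xs)]

src = np.zeros((8, 8), dtype=np.uint8)         # benign-looking source image
target = np.full((2, 2), 255, dtype=np.uint8)  # image the attacker wants after scaling

# Overwrite only the pixels the scaler will sample (4 of 64 pixels here).
ys = np.arange(2) * 8 // 2
xs = np.arange(2) * 8 // 2
attacked = src.copy()
attacked[np.ix_(ys, xs)] = target

# The attacked image scales down to the attacker's target image.
assert np.array_equal(nearest_neighbor_scale(attacked, 2), target)
```

In a realistic attack the modified pixels are blended to stay visually unnoticeable in the full-resolution image; the sketch only shows why sparse sampling makes the attack possible.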
no code implementations • 7 Aug 2023 • Marcel Moravek, Alexander Zender, Andreas Müller
For our studies, a pre-trained BERT model was used and fine-tuned on different datasets with different training methods to identify the context being searched for.
1 code implementation • 7 Mar 2023 • Andreas Müller, Deborah Schmidt, Lucas Rieckert, Michele Solimena, Martin Weigert
In this protocol, we describe a practical and annotation-efficient pipeline for organelle-specific segmentation, spatial analysis, and visualization of large volume electron microscopy datasets using freely available, user-friendly software tools that can be run on a single standard workstation.
1 code implementation • 6 Nov 2019 • Matthias Feurer, Jan N. van Rijn, Arlind Kadra, Pieter Gijsbers, Neeratyoy Mallik, Sahithya Ravi, Andreas Müller, Joaquin Vanschoren, Frank Hutter
It also provides functionality to conduct machine learning experiments, upload the results to OpenML, and reproduce results which are stored on OpenML.
no code implementations • 23 Nov 2018 • Florian Pfisterer, Jan N. van Rijn, Philipp Probst, Andreas Müller, Bernd Bischl
The performance of modern machine learning methods depends heavily on their hyperparameter configurations.
3 code implementations • 2 Jan 2012 • Fabian Pedregosa, Gaël Varoquaux, Alexandre Gramfort, Vincent Michel, Bertrand Thirion, Olivier Grisel, Mathieu Blondel, Andreas Müller, Joel Nothman, Gilles Louppe, Peter Prettenhofer, Ron Weiss, Vincent Dubourg, Jake Vanderplas, Alexandre Passos, David Cournapeau, Matthieu Brucher, Matthieu Perrot, Édouard Duchesnay
Scikit-learn is a Python module integrating a wide range of state-of-the-art machine learning algorithms for medium-scale supervised and unsupervised problems.
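Scikit-learn's uniform estimator API (fit/predict/score) can be shown in a few lines; a minimal sketch using the bundled Iris dataset (the specific model and split are illustrative choices, not prescribed by the library):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load a small built-in dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Every estimator follows the same pattern: construct, fit, then evaluate.
clf = LogisticRegression(max_iter=200).fit(X_train, y_train)
acc = clf.score(X_test, y_test)  # mean accuracy on the held-out split
```

The same fit/score pattern applies across the library's classifiers, regressors, and transformers, which is what makes swapping models into existing pipelines straightforward.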