1 code implementation • 27 May 2023 • Samuel Müller, Matthias Feurer, Noah Hollmann, Frank Hutter
In this paper, we use Prior-data Fitted Networks (PFNs) as a flexible surrogate for Bayesian Optimization (BO).
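The BO loop the abstract refers to can be sketched with a stand-in surrogate. In the paper, a trained PFN receives the observed (x, y) pairs in-context and returns a posterior over y at each candidate; here that posterior is faked with a nearest-observation mean and a distance-based uncertainty, purely to illustrate where the surrogate plugs into the loop. The objective `f`, the candidate grid, and the lower-confidence-bound acquisition are all illustrative assumptions, not the paper's setup.

```python
def f(x):
    # Toy objective to minimize (stands in for an expensive black box).
    return (x - 0.3) ** 2

def surrogate(x, observed):
    # Stand-in for a PFN surrogate: the real model would take the observed
    # (x, y) pairs in-context and output a posterior over y at x. Here we
    # fake a posterior: nearest observation's value as the mean, distance
    # to it as the uncertainty.
    nearest_x, nearest_y = min(observed, key=lambda p: abs(p[0] - x))
    return nearest_y, abs(nearest_x - x)

def bo_step(observed, candidates, kappa=1.0):
    # Pick the candidate minimizing a lower confidence bound (mean - kappa*std).
    def lcb(x):
        mu, sigma = surrogate(x, observed)
        return mu - kappa * sigma
    return min(candidates, key=lcb)

observed = [(0.0, f(0.0)), (1.0, f(1.0))]
candidates = [i / 100 for i in range(101)]
for _ in range(20):
    x = bo_step(observed, candidates)
    observed.append((x, f(x)))

best_x, best_y = min(observed, key=lambda p: p[1])
print(best_x, best_y)
```

Swapping the `surrogate` stub for a model that conditions on the observed points in a single forward pass is the role the PFN plays in the paper.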
no code implementations • 16 Jul 2022 • Diane Wagner, Fabio Ferreira, Danny Stoll, Robin Tibor Schirrmeister, Samuel Müller, Frank Hutter
Self-Supervised Learning (SSL) has become a very active area of deep learning research, where it is widely used as a pre-training method for classification and other tasks.
6 code implementations • 5 Jul 2022 • Noah Hollmann, Samuel Müller, Katharina Eggensperger, Frank Hutter
We present TabPFN, a trained Transformer that performs supervised classification on small tabular datasets in less than a second, requires no hyperparameter tuning, and is competitive with state-of-the-art classification methods.
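The unusual part of TabPFN's interface is that "fitting" only stores the training set and prediction happens in one pass over it. The toy class below mimics that interface with distance-weighted voting instead of a Transformer; the class name, the voting rule, and the example data are all illustrative assumptions, not the TabPFN model.

```python
import math

class ToyInContextClassifier:
    """Toy stand-in for TabPFN's interface: `fit` only stores the data and
    `predict` labels queries in a single pass over it, with no hyperparameter
    tuning. (The real TabPFN feeds training set and queries through a trained
    Transformer; here we just do distance-weighted voting.)"""

    def fit(self, X, y):
        self.X, self.y = X, y
        return self

    def predict_proba(self, X):
        probs = []
        for q in X:
            scores = {}
            for x, label in zip(self.X, self.y):
                d = math.dist(q, x)
                scores[label] = scores.get(label, 0.0) + 1.0 / (1e-9 + d)
            total = sum(scores.values())
            probs.append({k: v / total for k, v in scores.items()})
        return probs

    def predict(self, X):
        return [max(p, key=p.get) for p in self.predict_proba(X)]

clf = ToyInContextClassifier().fit([[0, 0], [0, 1], [5, 5], [6, 5]], [0, 0, 1, 1])
print(clf.predict([[0.2, 0.5], [5.5, 5.0]]))  # → [0, 1]
```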
1 code implementation • ICLR 2022 • Samuel Müller, Noah Hollmann, Sebastian Pineda Arango, Josif Grabocka, Frank Hutter
Our method restates the objective of posterior approximation as a supervised classification problem with a set-valued input: it repeatedly draws a task (or function) from the prior and draws a set of data points and their labels from it, then masks one of the labels and learns to make probabilistic predictions for it based on the set-valued input of the remaining data points.
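The draw-then-mask procedure described above can be sketched as a data generator. The prior here is a toy one (random noisy linear functions); the papers use much richer priors such as GPs or Bayesian neural networks, so treat `draw_task` as a placeholder.

```python
import random

def draw_task():
    # Draw a "function" from a toy prior: random linear functions
    # y = a*x + b with Gaussian noise. (Illustrative stand-in for the
    # richer priors used in the paper.)
    a, b = random.gauss(0, 1), random.gauss(0, 1)
    return lambda x: a * x + b + random.gauss(0, 0.1)

def make_training_example(n_points=8):
    # Sample a dataset from the drawn task, then mask one label: the model
    # sees the other (x, y) pairs plus the held-out x and must predict the
    # held-out y -- posterior approximation posed as supervised learning.
    f = draw_task()
    xs = [random.uniform(-1, 1) for _ in range(n_points)]
    ys = [f(x) for x in xs]
    i = random.randrange(n_points)
    context = [(x, y) for j, (x, y) in enumerate(zip(xs, ys)) if j != i]
    query_x, target_y = xs[i], ys[i]
    return context, query_x, target_y

context, query_x, target_y = make_training_example()
print(len(context), query_x, target_y)
```

Training then consists of feeding `context` plus `query_x` to a set-input network (a Transformer in the paper) and scoring its probabilistic prediction against `target_y`.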
no code implementations • 5 Feb 2021 • Samuel Müller, André Biedenkapp, Frank Hutter
To do this, we optimize the loss of the next training step.
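The idea of choosing by the loss of the next training step can be illustrated with a generic lookahead: simulate one gradient step per candidate setting and keep whichever yields the lowest post-step loss. The quadratic loss, the candidate learning rates, and the selection rule below are all assumptions for illustration, not the paper's method.

```python
def loss(w):
    # Toy quadratic training loss with minimum at w = 2.
    return (w - 2.0) ** 2

def grad(w):
    return 2.0 * (w - 2.0)

def next_step_loss(w, lr):
    # Loss after one simulated SGD step with this candidate learning rate.
    return loss(w - lr * grad(w))

# Lookahead selection: among candidates, keep the one whose *next*
# training step yields the lowest loss, then actually take that step.
w = 0.0
for _ in range(5):
    lr = min([0.01, 0.1, 0.4, 0.6], key=lambda r: next_step_loss(w, r))
    w -= lr * grad(w)

print(w, loss(w))
```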
1 code implementation • 20 Oct 2019 • Samuel Müller, Andreas Vlachos
In this work, we adapt BPE for text-to-SQL generation.
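For reference, the base byte-pair-encoding procedure that the paper adapts looks like this: repeatedly count adjacent symbol pairs and merge the most frequent one. How the paper tailors BPE to SQL is not shown here; this is only the standard algorithm on a toy corpus.

```python
from collections import Counter

def bpe_merges(corpus, n_merges):
    # Standard BPE: start from characters, repeatedly merge the most
    # frequent adjacent pair into a single symbol.
    words = [list(w) for w in corpus]
    merges = []
    for _ in range(n_merges):
        pairs = Counter()
        for w in words:
            for a, b in zip(w, w[1:]):
                pairs[(a, b)] += 1
        if not pairs:
            break
        (a, b), _count = pairs.most_common(1)[0]
        merges.append(a + b)
        new_words = []
        for w in words:
            out, i = [], 0
            while i < len(w):
                if i + 1 < len(w) and w[i] == a and w[i + 1] == b:
                    out.append(a + b)  # apply the learned merge
                    i += 2
                else:
                    out.append(w[i])
                    i += 1
            new_words.append(out)
        words = new_words
    return merges, words

merges, segmented = bpe_merges(["select", "selected", "selection"], 4)
print(merges)  # → ['se', 'sel', 'sele', 'selec']
```

On this corpus the shared prefix of the SQL keyword gets merged first, which hints at why subword units are attractive for SQL's small, repetitive vocabulary.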
1 code implementation • 5 Jun 2017 • Jochen Scheuer, Ilai Schwartz, Samuel Müller, Qiong Chen, Ish Dhand, Martin B. Plenio, Boris Naydenov, Fedor Jelezko
Here we demonstrate the polarization and read out of a nuclear spin bath consisting of $^{13}$C nuclear spins in diamond by using a single nitrogen-vacancy (NV) center.
Quantum Physics