Search Results for author: Samuel Müller

Found 11 papers, 8 papers with code

The Tabular Foundation Model TabPFN Outperforms Specialized Time Series Forecasting Models Based on Simple Features

1 code implementation • 6 Jan 2025 • Shi Bin Hoo, Samuel Müller, David Salinas, Frank Hutter

Foundation models have become popular in forecasting due to their ability to make accurate predictions, even with minimal fine-tuning on specific datasets.

Feature Engineering • Time Series • +1

Bayes' Power for Explaining In-Context Learning Generalizations

1 code implementation • 2 Oct 2024 • Samuel Müller, Noah Hollmann, Frank Hutter

In this paper, we argue that a more useful interpretation of neural network behavior in this era is as an approximation of the true posterior, as defined by the data-generating process.

In-Context Learning

PFNs4BO: In-Context Learning for Bayesian Optimization

1 code implementation • 27 May 2023 • Samuel Müller, Matthias Feurer, Noah Hollmann, Frank Hutter

In this paper, we use Prior-data Fitted Networks (PFNs) as a flexible surrogate for Bayesian Optimization (BO).

Bayesian Optimization • Hyperparameter Optimization • +1
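The idea of using a PFN as a BO surrogate can be illustrated with a minimal sketch. This is not the paper's implementation: the `surrogate` below is a hypothetical distance-weighted stand-in for the posterior that a trained Prior-data Fitted Network would provide, and `objective` is a toy target invented for the demo; only the outer loop (posterior prediction → expected improvement → evaluate → repeat) reflects the BO scheme.

```python
import math
import random

# Toy 1-D objective to minimize (a hypothetical stand-in, not from the paper).
def objective(x):
    return (x - 0.3) ** 2

# Stand-in surrogate: in PFNs4BO this posterior would come from a PFN
# conditioned on the observed points; here we fake it with a
# distance-weighted mean and a distance-based uncertainty.
def surrogate(x, observed):
    weights = [math.exp(-50 * (x - xi) ** 2) for xi, _ in observed]
    total = sum(weights)
    if total < 1e-9:
        return 0.5, 1.0               # far from all data: wide uncertainty
    mean = sum(w * yi for w, (_, yi) in zip(weights, observed)) / total
    std = max(1e-3, 1.0 - total)      # shrink uncertainty near data
    return mean, std

# Expected-improvement acquisition for minimization.
def expected_improvement(mean, std, best):
    z = (best - mean) / std
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    pdf = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    return (best - mean) * cdf + std * pdf

random.seed(0)
x0 = random.random()
observed = [(x0, objective(x0))]
candidates = [i / 100 for i in range(101)]

for _ in range(10):                   # BO iterations
    best = min(y for _, y in observed)
    # Pick the candidate maximizing EI under the surrogate posterior.
    x_next = max(candidates,
                 key=lambda x: expected_improvement(*surrogate(x, observed), best))
    observed.append((x_next, objective(x_next)))

best_x, best_y = min(observed, key=lambda p: p[1])
print(best_x, best_y)
```

Swapping the fake `surrogate` for a network that outputs a predictive distribution in one forward pass is what makes the PFN approach fast: no Gaussian-process refitting is needed between iterations.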

Large Language Models for Automated Data Science: Introducing CAAFE for Context-Aware Automated Feature Engineering

1 code implementation • NeurIPS 2023 • Noah Hollmann, Samuel Müller, Frank Hutter

Specifically, we introduce Context-Aware Automated Feature Engineering (CAAFE), a feature-engineering method for tabular datasets that uses an LLM to iteratively generate additional, semantically meaningful features based on a description of the dataset.

Automated Feature Engineering • Feature Engineering
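The generate-evaluate-keep loop behind such a method can be sketched in a few lines. Everything here is a hypothetical stand-in: `propose_feature` fakes the LLM call with a fixed candidate list, the dataset is invented, and the evaluator is a simple decision stump rather than the downstream model CAAFE would actually score with; only the accept/reject structure mirrors the described approach.

```python
# Tiny invented dataset: rows of (height_cm, weight_kg); label 1 = overweight.
rows = [(150, 62), (190, 80), (160, 70), (185, 72), (155, 48), (165, 78)]
labels = [1, 0, 1, 0, 0, 1]

def propose_feature(description, round_idx):
    # Stand-in for the LLM call; CAAFE would generate feature code from
    # the dataset description. Candidates are hand-written here.
    candidates = [
        ("height_plus_weight", lambda h, w: h + w),   # likely uninformative
        ("bmi", lambda h, w: w / (h / 100) ** 2),     # semantically meaningful
        ("height_minus_weight", lambda h, w: h - w),
    ]
    return candidates[round_idx % len(candidates)]

def score(features, labels):
    # Stand-in evaluator: best single-feature decision stump accuracy.
    n, best = len(features), 0.0
    for col in range(len(features[0])):
        vals = [f[col] for f in features]
        for t in vals:
            for pol in (True, False):
                acc = sum(((v <= t) == pol) == bool(l)
                          for v, l in zip(vals, labels)) / n
                best = max(best, acc)
    return best

features = [list(r) for r in rows]    # start from the raw columns
best = score(features, labels)
kept = []
for t in range(3):                    # iterate: generate -> evaluate -> keep/drop
    name, fn = propose_feature("people with height and weight", t)
    trial = [f + [fn(h, w)] for f, (h, w) in zip(features, rows)]
    new = score(trial, labels)
    if new > best:                    # keep a feature only if it helps
        features, best, kept = trial, new, kept + [name]

print(kept, best)
```

On this toy data the raw columns cannot be separated by a single stump, but the derived BMI column can, so the loop keeps `bmi` and discards the uninformative candidate — the same accept/reject logic, just with a real model and a real LLM in the paper.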

On the Importance of Hyperparameters and Data Augmentation for Self-Supervised Learning

no code implementations • 16 Jul 2022 • Diane Wagner, Fabio Ferreira, Danny Stoll, Robin Tibor Schirrmeister, Samuel Müller, Frank Hutter

Self-Supervised Learning (SSL) has become a very active area of Deep Learning research where it is heavily used as a pre-training method for classification and other tasks.

Bayesian Optimization • Data Augmentation • +2

TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second

6 code implementations • 5 Jul 2022 • Noah Hollmann, Samuel Müller, Katharina Eggensperger, Frank Hutter

We present TabPFN, a trained Transformer that can do supervised classification for small tabular datasets in less than a second, needs no hyperparameter tuning and is competitive with state-of-the-art classification methods.

AutoML • Bayesian Inference • +5

Transformers Can Do Bayesian Inference

1 code implementation • ICLR 2022 • Samuel Müller, Noah Hollmann, Sebastian Pineda Arango, Josif Grabocka, Frank Hutter

Our method restates the objective of posterior approximation as a supervised classification problem with a set-valued input: it repeatedly draws a task (or function) from the prior, draws a set of data points and their labels from it, masks one of the labels, and learns to make probabilistic predictions for it based on the set-valued input of the remaining data points.

AutoML • Bayesian Inference • +3
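The data-generation step described in that abstract — draw a task from the prior, sample labeled points, mask one label — is easy to sketch. This is a minimal illustration under strong assumptions: a toy linear-function prior stands in for the priors used in the paper, and the transformer is omitted entirely, replaced by a placeholder predictor.

```python
import random

# Sketch of PFN training-data generation (toy prior; the real method
# trains a transformer on millions of such examples).

def draw_task():
    # Draw a function from the prior: here y = a*x + b with random a, b.
    a, b = random.gauss(0, 1), random.gauss(0, 1)
    return lambda x: a * x + b

def sample_training_example(n_points=8):
    f = draw_task()
    xs = [random.uniform(-1, 1) for _ in range(n_points)]
    ys = [f(x) + random.gauss(0, 0.1) for x in xs]   # noisy labels
    hold = random.randrange(n_points)                # mask one label
    context = [(x, y) for i, (x, y) in enumerate(zip(xs, ys)) if i != hold]
    return context, xs[hold], ys[hold]               # set-valued input, query, target

random.seed(1)
context, x_query, y_target = sample_training_example()
# A real PFN maps (context, x_query) to a predictive distribution p(y);
# this placeholder just ignores x_query and returns the context mean.
y_pred = sum(y for _, y in context) / len(context)
print(len(context), x_query, y_target, y_pred)
```

Training a network to minimize the loss of its prediction for the held-out label, averaged over tasks drawn this way, is what pushes its outputs toward the true posterior predictive under the prior.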

Robust techniques for polarization and detection of nuclear spin ensembles

1 code implementation • 5 Jun 2017 • Jochen Scheuer, Ilai Schwartz, Samuel Müller, Qiong Chen, Ish Dhand, Martin B. Plenio, Boris Naydenov, Fedor Jelezko

Here we demonstrate the polarization and read out of a nuclear spin bath consisting of $^{13}$C nuclear spins in diamond by using a single nitrogen-vacancy (NV) center.

Quantum Physics
