1 code implementation • 25 Oct 2024 • Sebastian Pineda Arango, Maciej Janowski, Lennart Purucker, Arber Zela, Frank Hutter, Josif Grabocka
Finetuning is a widespread practice across different communities for adapting pretrained models to particular tasks.
1 code implementation • 6 Oct 2024 • Sebastian Pineda Arango, Maciej Janowski, Lennart Purucker, Arber Zela, Frank Hutter, Josif Grabocka
In this study, we explore employing neural networks as ensemble methods, emphasizing the significance of dynamic ensembling to leverage diverse model predictions adaptively.
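To make "dynamic ensembling" concrete, here is a minimal PyTorch sketch of the general idea: a small network maps the stacked predictions of the base models to per-example ensemble weights. The class name `DynamicEnsembler`, the layer sizes, and the softmax weighting are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class DynamicEnsembler(nn.Module):
    """Illustrative dynamic ensembler (not the paper's exact model): a small
    network maps the stacked base-model predictions to per-example weights."""

    def __init__(self, n_models: int, n_classes: int, hidden: int = 64):
        super().__init__()
        self.weight_net = nn.Sequential(
            nn.Linear(n_models * n_classes, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_models),
        )

    def forward(self, base_preds: torch.Tensor) -> torch.Tensor:
        # base_preds: (batch, n_models, n_classes) class probabilities
        batch, n_models, n_classes = base_preds.shape
        flat = base_preds.reshape(batch, n_models * n_classes)
        weights = torch.softmax(self.weight_net(flat), dim=-1)  # (batch, n_models)
        # Per-example weighted combination of the base predictions.
        return (weights.unsqueeze(-1) * base_preds).sum(dim=1)

# Toy usage: 5 base models, 3 classes, batch of 8.
preds = torch.softmax(torch.randn(8, 5, 3), dim=-1)
ensembler = DynamicEnsembler(n_models=5, n_classes=3)
combined = ensembler(preds)  # (8, 3), trainable end-to-end with cross-entropy
```

Because the weights depend on each example's base predictions, the combination adapts per input rather than using a single fixed weight vector, which is the distinction the excerpt draws against static ensembling.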
5 code implementations • 12 Mar 2024 • Abdul Fatir Ansari, Lorenzo Stella, Caner Turkmen, Xiyuan Zhang, Pedro Mercado, Huibin Shen, Oleksandr Shchur, Syama Sundar Rangapuram, Sebastian Pineda Arango, Shubham Kapoor, Jasper Zschiegner, Danielle C. Maddix, Hao Wang, Michael W. Mahoney, Kari Torkkola, Andrew Gordon Wilson, Michael Bohlke-Schneider, Yuyang Wang
We introduce Chronos, a simple yet effective framework for pretrained probabilistic time series models.
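For reference, Chronos is released as the open-source `chronos-forecasting` package; the minimal usage sketch below follows the project's public README (checkpoint name, `ChronosPipeline.from_pretrained`, and the `predict` call), which may change between versions.

```python
import torch
from chronos import ChronosPipeline  # pip install chronos-forecasting

# Load a pretrained Chronos checkpoint (name taken from the project README;
# other sizes such as chronos-t5-base/large are published as well).
pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",
    device_map="cpu",
    torch_dtype=torch.float32,
)

# Historical context: a 1-D tensor of past values of the series.
context = torch.tensor([112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 148.0])

# Probabilistic forecast: samples of shape (num_series, num_samples, prediction_length).
forecast = pipeline.predict(context, prediction_length=12)
low, median, high = torch.quantile(
    forecast[0].float(), torch.tensor([0.1, 0.5, 0.9]), dim=0
)
```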
no code implementations • 6 Jun 2023 • Sebastian Pineda Arango, Fabio Ferreira, Arlind Kadra, Frank Hutter, Josif Grabocka
With the ever-increasing number of pretrained models, machine learning practitioners are continuously faced with deciding which pretrained model to use and how to finetune it for a new dataset.
1 code implementation • 23 May 2023 • Sebastian Pineda Arango, Josif Grabocka
As a remedy, this paper proposes a novel neural architecture that captures the deep interaction between the components of a Machine Learning pipeline.
Automatic Machine Learning Model Selection
Bayesian Optimization
1 code implementation • 22 May 2023 • Arlind Kadra, Sebastian Pineda Arango, Josif Grabocka
Even though neural networks have long been deployed in applications involving tabular data, existing neural architectures are still not explainable by design.
no code implementations • 27 Mar 2023 • Abdus Salam Khazi, Sebastian Pineda Arango, Josif Grabocka
Automatically optimizing the hyperparameters of Machine Learning algorithms is one of the primary open questions in AI.
1 code implementation • ICLR 2022 • Samuel Müller, Noah Hollmann, Sebastian Pineda Arango, Josif Grabocka, Frank Hutter
Our method restates the objective of posterior approximation as a supervised classification problem with a set-valued input: it repeatedly draws a task (or function) from the prior, draws a set of data points and their labels from it, masks one of the labels and learns to make probabilistic predictions for it based on the set-valued input of the rest of the data points.
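A toy sketch of this prior-fitting loop is below. The prior (random linear functions), the mean-pooled set encoder standing in for the paper's Transformer, and the Gaussian output standing in for its discretized predictive distribution are all simplifying assumptions for illustration.

```python
import torch
import torch.nn as nn

def sample_task():
    """Hypothetical prior over tasks: random linear functions with noise."""
    w = torch.randn(2)
    return lambda x: x @ w + 0.1 * torch.randn(x.shape[0])

class SetPredictor(nn.Module):
    """Toy stand-in for the set-valued model (the paper uses a Transformer)."""
    def __init__(self, d_in: int = 2, d_hidden: int = 64):
        super().__init__()
        self.encode = nn.Sequential(nn.Linear(d_in + 1, d_hidden), nn.ReLU())
        self.decode = nn.Sequential(nn.Linear(d_hidden + d_in, d_hidden),
                                    nn.ReLU(), nn.Linear(d_hidden, 2))

    def forward(self, x_ctx, y_ctx, x_query):
        # Pool the labeled context set into one representation, then decode
        # a predictive mean and log-variance for each query point.
        ctx = self.encode(torch.cat([x_ctx, y_ctx[:, None]], dim=-1)).mean(0)
        out = self.decode(torch.cat([ctx.expand(x_query.shape[0], -1), x_query], dim=-1))
        return out[:, 0], out[:, 1]

model = SetPredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(1000):
    f = sample_task()                     # draw a task from the prior
    x = torch.randn(17, 2)
    y = f(x)                              # draw data points and their labels
    x_ctx, y_ctx, x_q, y_q = x[:-1], y[:-1], x[-1:], y[-1:]  # mask one label
    mean, log_var = model(x_ctx, y_ctx, x_q)
    # Negative log-likelihood of the held-out label under the predicted Gaussian.
    loss = 0.5 * (log_var + (y_q - mean) ** 2 / log_var.exp()).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```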
no code implementations • 29 Sep 2021 • Hadi Samer Jomaa, Sebastian Pineda Arango, Lars Schmidt-Thieme, Josif Grabocka
As a result, our novel DKLM can learn contextualized dataset-specific similarity representations for hyperparameter configurations.
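As a rough illustration of what a contextualized deep kernel can look like (the class name, meta-feature conditioning, and layer sizes below are assumptions for this sketch, not the paper's model): a network embeds each hyperparameter configuration together with the dataset's meta-features, and an RBF kernel on those embeddings defines the dataset-specific similarity.

```python
import torch
import torch.nn as nn

class ContextualDeepKernel(nn.Module):
    """Illustrative deep kernel: similarity between hyperparameter configurations
    computed on learned embeddings conditioned on dataset meta-features."""

    def __init__(self, d_config: int, d_meta: int, d_embed: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_config + d_meta, 64), nn.ReLU(),
            nn.Linear(64, d_embed),
        )
        self.log_lengthscale = nn.Parameter(torch.zeros(()))

    def forward(self, configs: torch.Tensor, meta: torch.Tensor) -> torch.Tensor:
        # configs: (n, d_config); meta: (d_meta,) meta-features of the current dataset
        z = self.net(torch.cat([configs, meta.expand(configs.shape[0], -1)], dim=-1))
        dists = torch.cdist(z, z) ** 2
        return torch.exp(-dists / (2 * self.log_lengthscale.exp() ** 2))  # (n, n) kernel

# Toy usage: kernel over 10 configurations on a dataset described by 4 meta-features.
K = ContextualDeepKernel(d_config=6, d_meta=4)(torch.rand(10, 6), torch.rand(4))
```

In practice such a kernel matrix would feed a Gaussian process surrogate and be trained end-to-end through the marginal likelihood.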
no code implementations • 5 Aug 2021 • Sebastian Pineda Arango, Felix Heinrich, Kiran Madhusudhanan, Lars Schmidt-Thieme
Recent work has shown the efficiency of deep learning models such as Fully Convolutional Networks (FCN) or Recurrent Neural Networks (RNN) in dealing with Time Series Regression (TSR) problems.
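As a generic illustration of the FCN-style models referred to here (a sketch under common conventions, not the paper's architecture): a stack of 1-D convolutions, global average pooling over time, and a linear regression head.

```python
import torch
import torch.nn as nn

class FCNRegressor(nn.Module):
    """Generic fully convolutional network for time series regression:
    1-D conv blocks, global average pooling, scalar regression head."""

    def __init__(self, in_channels: int = 1):
        super().__init__()
        self.blocks = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=8, padding="same"),
            nn.BatchNorm1d(64), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, padding="same"),
            nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, 64, kernel_size=3, padding="same"),
            nn.BatchNorm1d(64), nn.ReLU(),
        )
        self.head = nn.Linear(64, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -> (batch,) regression target
        features = self.blocks(x).mean(dim=-1)  # global average pooling over time
        return self.head(features).squeeze(-1)

# Toy usage: batch of 16 univariate series of length 200.
model = FCNRegressor()
y_hat = model(torch.randn(16, 1, 200))  # shape (16,)
```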
1 code implementation • 11 Jun 2021 • Sebastian Pineda Arango, Hadi S. Jomaa, Martin Wistuba, Josif Grabocka
Hyperparameter optimization (HPO) is a core problem for the machine learning community and remains largely unsolved due to the significant computational resources required to evaluate hyperparameter configurations.