no code implementations • 4 Aug 2022 • Kalifou René Traoré, Andrés Camero, Xiao Xiang Zhu
Hyperparameter optimization (HPO) is a well-studied research field.
no code implementations • CVPR 2022 • Aysim Toker, Lukas Kondmann, Mark Weber, Marvin Eisenberger, Andrés Camero, Jingliang Hu, Ariadna Pregel Hoderlein, Çağlar Şenaras, Timothy Davis, Daniel Cremers, Giovanni Marchisio, Xiao Xiang Zhu, Laura Leal-Taixé
These observations are paired with pixel-wise monthly semantic segmentation labels of 7 land use and land cover (LULC) classes.
no code implementations • 17 Jan 2022 • Kalifou René Traoré, Andrés Camero, Xiao Xiang Zhu
With the rapid rise of neural architecture search (NAS), the ability to understand the complexity of its search spaces from the perspective of a search algorithm is desirable.
1 code implementation • 5 Nov 2021 • Kalifou René Traoré, Andrés Camero, Xiao Xiang Zhu
First, we perform a calibrated clustering analysis of the search space, and second, we extract the centroids and use them to initialize a NAS algorithm.
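The two-step procedure in this entry can be sketched with a plain k-means pass over binary architecture encodings, whose centroids are rounded back to valid encodings and used as the initial NAS population. The toy search space, encoding length, and cluster count below are illustrative assumptions, not details from the paper.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means on real-valued vectors (no external dependencies)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # move each centroid to the mean of its cluster
        for i, members in enumerate(clusters):
            if members:
                centroids[i] = [sum(dim) / len(members) for dim in zip(*members)]
    return centroids

def initial_population(encodings, k):
    """Round each centroid to the nearest binary encoding -> NAS seed architectures."""
    return [[round(x) for x in c] for c in kmeans(encodings, k)]

# toy search space: 100 random 6-bit architecture encodings
rng = random.Random(42)
space = [[rng.randint(0, 1) for _ in range(6)] for _ in range(100)]
seeds = initial_population(space, k=4)
```

Seeding a population-based search with cluster centroids, rather than uniform random samples, spreads the initial candidates across distinct regions of the search space.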
no code implementations • 2 Nov 2021 • Kalifou René Traoré, Andrés Camero, Xiao Xiang Zhu
In this paper, we propose to use fitness landscape analysis to study a neural architecture search problem.
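A minimal illustration of fitness landscape analysis on a binary-encoded search space: enumerate the one-bit-flip neighborhood of every configuration and count local optima, a basic measure of landscape ruggedness. The toy fitness function is an assumption for demonstration only, not the paper's NAS objective.

```python
from itertools import product

def neighbors(arch):
    """One-bit-flip neighborhood over a binary architecture encoding."""
    for i in range(len(arch)):
        yield arch[:i] + (1 - arch[i],) + arch[i + 1:]

def local_optima(fitness, n_bits):
    """Count configurations that no single mutation can improve (maximization)."""
    count = 0
    for arch in product((0, 1), repeat=n_bits):
        if all(fitness(arch) >= fitness(nb) for nb in neighbors(arch)):
            count += 1
    return count

# toy fitness: reward alternating bit patterns
fit = lambda a: sum(a[i] != a[i + 1] for i in range(len(a) - 1))
print(local_optima(fit, n_bits=4))  # -> 4
```

A single local optimum indicates an easy, unimodal landscape; many local optima warn that simple local search will stall, which is exactly the kind of information a landscape analysis feeds back to the choice of search algorithm.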
no code implementations • 20 Aug 2021 • Kalifou René Traoré, Andrés Camero, Xiao Xiang Zhu
In this study, we propose to accelerate a NAS algorithm using a data-driven initialization technique, leveraging the availability of NAS benchmarks.
no code implementations • 29 Jun 2021 • Andrés Camero, Jamal Toutouh, Enrique Alba
This article introduces Random Error Sampling-based Neuroevolution (RESN), a novel automatic method to optimize recurrent neural network architectures.
1 code implementation • 29 Jan 2020 • Andrés Camero, Hao Wang, Enrique Alba, Thomas Bäck
Recurrent neural networks (RNNs) are a powerful approach for time series prediction.
no code implementations • 12 Oct 2019 • José Á. Morell, Andrés Camero, Enrique Alba
Volunteers then access the web page of the problem and process the tasks in their web browsers.
1 code implementation • 4 Sep 2019 • Andrés Camero, Jamal Toutouh, Enrique Alba
Our findings show that we can achieve state-of-the-art error performance and that we reduce by half the time needed to perform the optimization.
1 code implementation • 10 Jul 2018 • Andrés Camero, Jamal Toutouh, Enrique Alba
Deep learning hyper-parameter optimization is a tough task.
no code implementations • 18 May 2018 • Andrés Camero, Jamal Toutouh, Enrique Alba
In this study, we propose a low computational cost model to evaluate the expected performance of a given architecture based on the distribution of the error of random samples of the weights.
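The idea of scoring an architecture from the error distribution of randomly sampled weights can be sketched as follows: draw many random weight vectors, evaluate the error of each without any training, and summarize the lower tail of the resulting distribution. The linear model, sampling range, and 5th-percentile summary are illustrative choices, not the paper's exact estimator.

```python
import random
import statistics

def mae(weights, data):
    """Mean absolute error of a linear model y = w.x + b on (x, y) pairs."""
    *w, b = weights
    return statistics.mean(
        abs(sum(wi * xi for wi, xi in zip(w, x)) + b - y) for x, y in data
    )

def random_sampling_score(n_weights, data, samples=200, seed=1):
    """Proxy for expected performance: the lower tail of the MAE distribution
    under random weights. Cheaper than training: forward passes only."""
    rng = random.Random(seed)
    errors = sorted(
        mae([rng.uniform(-1, 1) for _ in range(n_weights)], data)
        for _ in range(samples)
    )
    return errors[len(errors) // 20]  # 5th percentile as the estimate

# toy dataset generated by y = 2*x0 - x1
rng = random.Random(0)
data = [((x0, x1), 2 * x0 - x1) for x0, x1 in
        ((rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(50))]
score = random_sampling_score(n_weights=3, data=data)
```

Because no gradients are computed, such a score can rank many candidate architectures at a fraction of the cost of training each one.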