Search Results for author: Andrés Camero

Found 12 papers, 4 papers with code

Landscape of Neural Architecture Search across sensors: how much do they differ?

no code implementations • 17 Jan 2022 • Kalifou René Traoré, Andrés Camero, Xiao Xiang Zhu

With the rapid rise of neural architecture search, the ability to understand its complexity from the perspective of a search algorithm is desirable.

Image Classification • Neural Architecture Search

A Data-driven Approach to Neural Architecture Search Initialization

1 code implementation • 5 Nov 2021 • Kalifou René Traoré, Andrés Camero, Xiao Xiang Zhu

First, we perform a calibrated clustering analysis of the search space, and second, we extract the centroids and use them to initialize a NAS algorithm.

Neural Architecture Search
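
The abstract above describes a two-step recipe: cluster an encoding of the search space, then use the cluster centroids to seed the NAS algorithm. A minimal sketch of that idea follows, assuming architectures are already encoded as fixed-length numeric vectors and using scikit-learn's KMeans; the encoding, the choice of k, and the nearest-to-centroid selection are illustrative assumptions, not the paper's exact procedure.

```python
# Hypothetical sketch: centroid-based initialization of a NAS population.
# Assumes each candidate architecture is encoded as a fixed-length numeric vector.
import numpy as np
from sklearn.cluster import KMeans

def centroid_initialization(encodings: np.ndarray, population_size: int) -> np.ndarray:
    """Cluster the encoded search space and return, for each cluster,
    the encoding closest to its centroid as an initial NAS population."""
    kmeans = KMeans(n_clusters=population_size, n_init=10, random_state=0)
    labels = kmeans.fit_predict(encodings)
    population = []
    for k, centroid in enumerate(kmeans.cluster_centers_):
        members = encodings[labels == k]
        # Pick the real architecture nearest to the centroid, since a raw centroid
        # generally does not decode to a valid architecture.
        nearest = members[np.argmin(np.linalg.norm(members - centroid, axis=1))]
        population.append(nearest)
    return np.stack(population)

# Usage: 1,000 random encodings of length 16, seeding a population of 10.
rng = np.random.default_rng(0)
search_space = rng.random((1000, 16))
initial_population = centroid_initialization(search_space, population_size=10)
print(initial_population.shape)  # (10, 16)
```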

Lessons from the Clustering Analysis of a Search Space: A Centroid-based Approach to Initializing NAS

no code implementations • 20 Aug 2021 • Kalifou René Traoré, Andrés Camero, Xiao Xiang Zhu

In this study, we propose to accelerate a NAS algorithm using a data-driven initialization technique, leveraging the availability of NAS benchmarks.

Neural Architecture Search

Reliable and Fast Recurrent Neural Network Architecture Optimization

no code implementations • 29 Jun 2021 • Andrés Camero, Jamal Toutouh, Enrique Alba

This article introduces Random Error Sampling-based Neuroevolution (RESN), a novel automatic method to optimize recurrent neural network architectures.
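
RESN pairs an evolutionary search over recurrent architectures with a cheap fitness estimate obtained from random error sampling (see the last entry on this page). A rough sketch of such a loop is below; the integer-vector encoding, the toy mutation operator, and the estimate_expected_error callback are hypothetical stand-ins rather than the paper's actual components.

```python
# Hypothetical neuroevolution loop: evolve RNN architecture encodings using a
# cheap error-sampling estimate as the fitness proxy instead of full training.
import random

def mutate(arch: list[int], max_units: int = 128) -> list[int]:
    """Randomly perturb the number of hidden units in one layer (toy mutation)."""
    child = arch.copy()
    layer = random.randrange(len(child))
    child[layer] = max(1, min(max_units, child[layer] + random.choice([-16, -8, 8, 16])))
    return child

def evolve(estimate_expected_error, population: list[list[int]], generations: int = 20):
    """Simple (mu + lambda)-style loop; estimate_expected_error(arch) -> float
    is assumed to implement the random-weight error-sampling proxy."""
    scored = [(estimate_expected_error(a), a) for a in population]
    for _ in range(generations):
        parents = sorted(scored)[: max(2, len(scored) // 2)]
        children = [mutate(a) for _, a in parents]
        scored = parents + [(estimate_expected_error(c), c) for c in children]
    return min(scored)[1]  # best architecture found

# Usage with a toy stand-in for the error estimate (smaller nets "score" better
# here purely for illustration).
best = evolve(lambda arch: sum(arch) + random.random(),
              [[64, 64], [32, 16], [128]], generations=10)
print(best)
```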

JSDoop and TensorFlow.js: Volunteer Distributed Web Browser-Based Neural Network Training

no code implementations • 12 Oct 2019 • José Á. Morell, Andrés Camero, Enrique Alba

Volunteers access the web page of the problem and start processing the tasks in their web browsers.

Random Error Sampling-based Recurrent Neural Network Architecture Optimization

1 code implementation • 4 Sep 2019 • Andrés Camero, Jamal Toutouh, Enrique Alba

Our findings show that we achieve state-of-the-art error performance while halving the time needed to perform the optimization.

Hyperparameter Optimization

DLOPT: Deep Learning Optimization Library

1 code implementation • 10 Jul 2018 • Andrés Camero, Jamal Toutouh, Enrique Alba

Deep learning hyper-parameter optimization is a tough task.

Low-Cost Recurrent Neural Network Expected Performance Evaluation

no code implementations • 18 May 2018 • Andrés Camero, Jamal Toutouh, Enrique Alba

In this study, we propose a low-computational-cost model that evaluates the expected performance of a given architecture based on the error distribution of random samples of the weights.
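
In other words, instead of training each candidate, the model repeatedly draws random weights for the architecture, measures the resulting error, and summarizes that error distribution. A minimal Keras-based sketch under that reading is given below; the MAE loss, the Gaussian weight sampling, the sample count, and the percentile used as the summary statistic are all assumptions for illustration.

```python
# Hypothetical sketch: estimate an RNN architecture's expected performance from
# the error distribution of randomly sampled weights (no training involved).
import numpy as np
import tensorflow as tf

def expected_error(hidden_units: list[int], x: np.ndarray, y: np.ndarray,
                   n_samples: int = 100, percentile: float = 10.0) -> float:
    """Build an LSTM stack, re-initialize its weights n_samples times,
    and return a low percentile of the mean-absolute-error distribution."""
    model = tf.keras.Sequential(
        [tf.keras.Input(shape=x.shape[1:])]
        + [tf.keras.layers.LSTM(u, return_sequences=(i < len(hidden_units) - 1))
           for i, u in enumerate(hidden_units)]
        + [tf.keras.layers.Dense(y.shape[-1])]
    )
    model.compile(loss="mae")
    errors = []
    for _ in range(n_samples):
        # Draw a fresh random weight vector for every sample.
        new_weights = [np.random.normal(0.0, 0.05, w.shape) for w in model.get_weights()]
        model.set_weights(new_weights)
        errors.append(model.evaluate(x, y, verbose=0))
    return float(np.percentile(errors, percentile))

# Usage: toy sequence data, score one candidate architecture.
x = np.random.random((64, 20, 3)).astype("float32")
y = np.random.random((64, 1)).astype("float32")
print(expected_error([32, 16], x, y, n_samples=20))
```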
