Search Results for author: Juan Pablo García Amboage

Found 2 papers, 0 papers with code

Model Performance Prediction for Hyperparameter Optimization of Deep Learning Models Using High Performance Computing and Quantum Annealing

no code implementations · 29 Nov 2023 · Juan Pablo García Amboage, Eric Wulff, Maria Girone, Tomás F. Pena

Hyperparameter Optimization (HPO) of deep learning-based models tends to be a compute-resource-intensive process, as it usually requires training the target model with many different hyperparameter configurations.

Hyperparameter Optimization
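To illustrate why the cost described in the abstract above scales with the number of configurations, here is a minimal random-search HPO sketch. It is not the paper's method (which uses quantum-assisted performance prediction to reduce this cost); the search space and the `train_and_evaluate` placeholder are hypothetical.

```python
import random

def train_and_evaluate(config):
    """Hypothetical stand-in for a full training run; in practice this
    is the expensive step that dominates HPO cost."""
    # Placeholder score; a real implementation would train the target
    # model with `config` and return its validation metric.
    return random.random()

def random_search(search_space, n_trials):
    """Evaluate n_trials randomly sampled configurations and keep the best.
    Total cost grows linearly with n_trials, since each trial is a full
    training run of the target model."""
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = {name: random.choice(values) for name, values in search_space.items()}
        score = train_and_evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

# Hypothetical search space for illustration only.
space = {"learning_rate": [1e-4, 1e-3, 1e-2], "batch_size": [32, 64, 128]}
print(random_search(space, n_trials=20))
```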

Hyperparameter optimization, quantum-assisted model performance prediction, and benchmarking of AI-based High Energy Physics workloads using HPC

no code implementations · 27 Mar 2023 · Eric Wulff, Maria Girone, David Southwick, Juan Pablo García Amboage, Eduard Cuba

Training and Hyperparameter Optimization (HPO) of deep learning-based AI models are often compute-resource intensive and call for the use of large-scale distributed resources as well as scalable and resource-efficient hyperparameter search algorithms.

Benchmarking · Hyperparameter Optimization
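The abstract snippet does not name the specific search algorithms benchmarked, but successive halving is one common example of a resource-efficient method in this family. Below is a minimal sketch, assuming a hypothetical `partial_train` function; it only illustrates the budget-allocation idea, not the paper's actual HPC workload.

```python
import random

def partial_train(config, budget):
    """Hypothetical placeholder: train `config` for `budget` epochs and
    return a validation score. Not the paper's actual workload."""
    return random.random() * budget

def successive_halving(configs, min_budget=1, eta=3):
    """Allocate a small budget to many configurations, then repeatedly keep
    the top 1/eta performers and multiply their budget by eta."""
    budget = min_budget
    while len(configs) > 1:
        scored = [(partial_train(c, budget), c) for c in configs]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        keep = max(1, len(scored) // eta)  # survivors for the next round
        configs = [c for _, c in scored[:keep]]
        budget *= eta
    return configs[0]

# Hypothetical candidate configurations for illustration only.
candidates = [{"learning_rate": random.choice([1e-4, 1e-3, 1e-2]),
               "batch_size": random.choice([32, 64, 128])} for _ in range(27)]
print(successive_halving(candidates))
```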
