Search Results for author: Haralampos Pozidis

Found 7 papers, 1 paper with code

Search-based Methods for Multi-Cloud Configuration

no code implementations • 20 Apr 2022 • Małgorzata Łazuka, Thomas Parnell, Andreea Anghel, Haralampos Pozidis

Our experiments indicate that (a) many state-of-the-art cloud configuration solutions can be adapted to multi-cloud, with the best results obtained by adaptations that exploit the hierarchical structure of the multi-cloud configuration domain; (b) hierarchical methods from AutoML can be applied to the multi-cloud configuration task and can outperform state-of-the-art cloud configuration solutions; and (c) CB achieves competitive or lower regret than the other tested algorithms, while also identifying configurations with 65% lower median cost and 20% lower median time in production, compared to choosing a random provider and configuration.

AutoML • Cloud Computing

What can multi-cloud configuration learn from AutoML?

no code implementations • 29 Sep 2021 • Malgorzata Lazuka, Thomas Parnell, Andreea Anghel, Haralampos Pozidis

Multi-cloud computing has become increasingly popular with enterprises looking to avoid vendor lock-in.

AutoML • Cloud Computing

SnapBoost: A Heterogeneous Boosting Machine

2 code implementations • NeurIPS 2020 • Thomas Parnell, Andreea Anghel, Malgorzata Lazuka, Nikolas Ioannou, Sebastian Kurella, Peshal Agarwal, Nikolaos Papandreou, Haralampos Pozidis

At each boosting iteration, the goal is to find the base hypothesis, selected from some base hypothesis class, that is closest to the Newton descent direction in a Euclidean sense.
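The Newton step described above has a well-known closed form at the leaf level of a regression tree. Below is a minimal NumPy sketch of that standard GBDT result; `newton_leaf_value` is a hypothetical helper for illustration, not SnapBoost's actual API, and `lam` is an assumed regularization parameter.

```python
import numpy as np

def newton_leaf_value(grad, hess, lam=1.0):
    """Optimal leaf value under a second-order (Newton) approximation:
    w* = -sum(g) / (sum(h) + lambda), i.e. the constant prediction for
    the leaf that is closest to the Newton descent direction.
    (Standard GBDT derivation; hypothetical helper, not SnapBoost's API.)"""
    return -grad.sum() / (hess.sum() + lam)

# Toy usage with logistic-loss derivatives for labels y in {0, 1}.
y = np.array([1.0, 1.0, 0.0, 1.0])
p = np.full(4, 0.5)            # current predicted probabilities
grad = p - y                   # first derivative of the logistic loss
hess = p * (1 - p)             # second derivative of the logistic loss
w = newton_leaf_value(grad, hess)  # positive: pushes predictions upward
```

The heterogeneity in SnapBoost comes from varying the base hypothesis class across iterations; the leaf-value formula above stays the same regardless of which class is chosen.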

Sampling Acquisition Functions for Batch Bayesian Optimization

no code implementations • 22 Mar 2019 • Alessandro De Palma, Celestine Mendler-Dünner, Thomas Parnell, Andreea Anghel, Haralampos Pozidis

We present Acquisition Thompson Sampling (ATS), a novel technique for batch Bayesian Optimization (BO) based on the idea of sampling multiple acquisition functions from a stochastic process.
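The core idea, selecting each batch point with a different randomly sampled acquisition function, can be sketched as follows. This is a simplified stand-in, not the paper's actual method: it uses a 1-D GP on a candidate grid and samples UCB exploration weights as the "sampled acquisitions"; all function names and parameters are assumptions for illustration.

```python
import numpy as np

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(Xtr, ytr, Xc, noise=1e-6):
    """Standard GP regression posterior mean/std on candidate points Xc."""
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = rbf(Xtr, Xc)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v * v, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def sampled_acquisition_batch(Xtr, ytr, Xc, batch_size=3, seed=0):
    """Pick a batch by drawing one acquisition function per batch point.
    Each draw is a UCB with a randomly sampled exploration weight; this
    stands in for sampling acquisitions from a stochastic process."""
    rng = np.random.default_rng(seed)
    mu, sd = gp_posterior(Xtr, ytr, Xc)
    batch = []
    for _ in range(batch_size):
        beta = rng.gamma(2.0, 1.0)            # sampled acquisition parameter
        batch.append(Xc[np.argmax(mu + beta * sd)])
    return batch

# Toy usage: choose a batch of 3 points for a 1-D objective
Xtr = np.array([0.1, 0.5, 0.9])
ytr = np.sin(6 * Xtr)                          # observed objective values
Xc = np.linspace(0, 1, 201)                    # candidate grid
batch = sampled_acquisition_batch(Xtr, ytr, Xc)
```

Because each batch point maximizes a different sampled acquisition, the batch naturally mixes exploration and exploitation without explicit diversity penalties.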

Bayesian Optimization • Thompson Sampling

Benchmarking and Optimization of Gradient Boosting Decision Tree Algorithms

no code implementations • 12 Sep 2018 • Andreea Anghel, Nikolaos Papandreou, Thomas Parnell, Alessandro De Palma, Haralampos Pozidis

Gradient boosting decision trees (GBDTs) have seen widespread adoption in academia, industry and competitive data science due to their state-of-the-art performance in many machine learning tasks.

Bayesian Optimization • Benchmarking

Understanding and Optimizing the Performance of Distributed Machine Learning Applications on Apache Spark

no code implementations • 5 Dec 2016 • Celestine Dünner, Thomas Parnell, Kubilay Atasu, Manolis Sifalakis, Haralampos Pozidis

We begin by analyzing the characteristics of a state-of-the-art distributed machine learning algorithm implemented in Spark and compare it to an equivalent reference implementation using the high performance computing framework MPI.

BIG-bench Machine Learning • Computational Efficiency
