Search Results for author: James W. Demmel

Found 4 papers, 2 papers with code

Surrogate-based Autotuning for Randomized Sketching Algorithms in Regression Problems

No code implementations · 30 Aug 2023 · Younghyun Cho, James W. Demmel, Michał Dereziński, Haoyun Li, Hengrui Luo, Michael W. Mahoney, Riley J. Murray

Algorithms from Randomized Numerical Linear Algebra (RandNLA) are known to be effective in handling high-dimensional computational problems, providing high-quality empirical performance as well as strong probabilistic guarantees.

Tags: regression
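As a minimal illustration of the sketch-and-solve idea behind RandNLA regression solvers (this is a generic numpy sketch with an assumed Gaussian sketching matrix and toy dimensions, not the autotuned pipeline the paper studies):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 10_000, 50          # tall least-squares problem: n >> d
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Gaussian sketching matrix S (m x n), m << n, compresses the problem.
m = 500
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Solve the small sketched problem min ||SAx - Sb|| instead of the full one.
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

rel_err = np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact)
print(f"relative error of sketched solution: {rel_err:.3f}")
```

The sketch size m trades accuracy for speed; surrogate-based autotuning, as in the paper, searches over such parameters automatically.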

Hybrid Parameter Search and Dynamic Model Selection for Mixed-Variable Bayesian Optimization

1 code implementation · 3 Jun 2022 · Hengrui Luo, Younghyun Cho, James W. Demmel, Xiaoye S. Li, Yang Liu

This paper presents a new type of hybrid model for Bayesian optimization (BO) adept at managing mixed variables, encompassing both quantitative (continuous and integer) and qualitative (categorical) types.

Tags: Bayesian Optimization, Gaussian Processes, +2
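To illustrate the mixed search space this hybrid model targets (a toy sketch: the objective, bounds, and category names are assumptions, and plain random search stands in for the paper's model-based selection):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical qualitative choice with a per-category cost offset.
CATEGORIES = ["gemm", "strassen", "winograd"]
CAT_PENALTY = {"gemm": 0.0, "strassen": 0.3, "winograd": 0.6}

def objective(x_cont, x_int, x_cat):
    """Toy tuning objective mixing continuous, integer, and categorical variables."""
    return (x_cont - 0.25) ** 2 + 0.1 * abs(x_int - 4) + CAT_PENALTY[x_cat]

def sample():
    """Draw one configuration from the mixed space."""
    return (rng.uniform(0.0, 1.0),                      # continuous in [0, 1]
            int(rng.integers(1, 9)),                    # integer in {1, ..., 8}
            CATEGORIES[rng.integers(len(CATEGORIES))])  # categorical

best = min((sample() for _ in range(200)), key=lambda p: objective(*p))
print("best configuration found:", best)
```

A BO method replaces the blind sampling loop with a surrogate model that proposes promising configurations; handling all three variable types in one surrogate is the hard part the paper addresses.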

Non-smooth Bayesian Optimization in Tuning Problems

1 code implementation · 15 Sep 2021 · Hengrui Luo, James W. Demmel, Younghyun Cho, Xiaoye S. Li, Yang Liu

By using this surrogate model, we aim to capture the non-smoothness of the black-box function.

Tags: Bayesian Optimization
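To see why non-smoothness matters in tuning problems, consider a black-box objective with a kink and a jump; a smooth surrogate (e.g. a standard GP) tends to blur the jump, while a piecewise-constant fit can represent it. Both the objective and the simple binned surrogate below are illustrative assumptions, not the paper's model:

```python
import numpy as np

def black_box(x):
    """Non-smooth toy objective: a kink at 0.3 and a jump at 0.7."""
    return np.abs(x - 0.3) + np.where(x > 0.7, 1.0, 0.0)

rng = np.random.default_rng(2)
xs = rng.uniform(0.0, 1.0, 200)   # observed configurations
ys = black_box(xs)

# Piecewise-constant surrogate: mean observed response in each of 20 bins.
bins = np.linspace(0.0, 1.0, 21)
idx = np.clip(np.digitize(xs, bins) - 1, 0, 19)
bin_means = np.array([ys[idx == i].mean() for i in range(20)])

def surrogate(x):
    """Predict by looking up the bin containing x."""
    return bin_means[min(int(x * 20), 19)]

# The surrogate preserves the discontinuity across 0.7.
print(surrogate(0.69), surrogate(0.72))
```

The two predictions straddling 0.7 differ by roughly the jump height, which is exactly the behavior a smooth surrogate would average away.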

Multitask and Transfer Learning for Autotuning Exascale Applications

No code implementations · 15 Aug 2019 · Wissam M. Sid-Lakhdar, Mohsen Mahmoudi Aznaveh, Xiaoye S. Li, James W. Demmel

Multitask learning and transfer learning have proven to be useful in the field of machine learning when additional knowledge is available to help a prediction task.

Tags: Transfer Learning
