Search Results for author: Haoxian Chen

Found 7 papers, 2 papers with code

Pseudo-Bayesian Optimization

no code implementations 15 Oct 2023 Haoxian Chen, Henry Lam

Its key idea is to use a surrogate model to approximate the objective and, importantly, to quantify the associated uncertainty, which allows a sequential search over query points that balances exploitation and exploration (a generic sketch of this loop follows below).

Bayesian Optimization · Uncertainty Quantification
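The quoted abstract describes the generic surrogate-plus-acquisition loop behind Bayesian optimization. The sketch below is a minimal, hedged illustration of that loop using a Gaussian-process surrogate and an upper-confidence-bound acquisition rule; it is not the paper's Pseudo-Bayesian procedure, and the toy objective, kernel, and candidate grid are placeholder choices.

```python
# Minimal Bayesian-optimization loop: GP surrogate + UCB acquisition.
# Illustrative only; the objective, kernel, and grid are placeholders,
# not the method proposed in "Pseudo-Bayesian Optimization".
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def objective(x):                      # toy 1-D black-box objective
    return -np.sin(3 * x) - x**2 + 0.7 * x

rng = np.random.default_rng(0)
grid = np.linspace(-2.0, 2.0, 400).reshape(-1, 1)   # candidate query points
X = rng.uniform(-2.0, 2.0, size=(3, 1))             # initial design
y = objective(X).ravel()

for _ in range(15):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)    # surrogate mean + uncertainty
    ucb = mu + 2.0 * sigma                           # exploitation + exploration
    x_next = grid[np.argmax(ucb)].reshape(1, 1)      # next query point
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best query:", X[np.argmax(y)].item(), "best value:", y.max())
```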

Hybrid Random Features

1 code implementation ICLR 2022 Krzysztof Choromanski, Haoxian Chen, Han Lin, Yuanzhe Ma, Arijit Sehanobish, Deepali Jain, Michael S Ryoo, Jake Varley, Andy Zeng, Valerii Likhosherstov, Dmitry Kalashnikov, Vikas Sindhwani, Adrian Weller

We propose a new class of random feature methods for linearizing softmax and Gaussian kernels, called hybrid random features (HRFs), that automatically adapt the quality of kernel estimation to provide the most accurate approximation in the defined regions of interest (a background sketch of random-feature kernel linearization follows below).

Benchmarking
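As background for the abstract above, the snippet below sketches the standard random-Fourier-feature construction for the Gaussian kernel, i.e. what "linearizing" a kernel with random features means. HRFs combine and adapt estimators of this kind; the paper's hybrid construction itself is not reproduced here.

```python
# Random Fourier features for the Gaussian (RBF) kernel: k(x, y) ~= phi(x) @ phi(y).
# Background sketch only; hybrid random features (HRFs) build on constructions
# like this one and are not reproduced here.
import numpy as np

def gaussian_rff(X, num_features, sigma=1.0, seed=0):
    """Map rows of X to random features approximating exp(-||x-y||^2 / (2 sigma^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, num_features))   # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)        # random phases
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 3))
phi = gaussian_rff(X, num_features=4096)
approx = phi @ phi.T                                             # linearized kernel matrix
exact = np.exp(-0.5 * np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1))
print("max abs error:", np.abs(approx - exact).max())
```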

From block-Toeplitz matrices to differential equations on graphs: towards a general theory for scalable masked Transformers

1 code implementation 16 Jul 2021 Krzysztof Choromanski, Han Lin, Haoxian Chen, Tianyi Zhang, Arijit Sehanobish, Valerii Likhosherstov, Jack Parker-Holder, Tamas Sarlos, Adrian Weller, Thomas Weingarten

In this paper we provide, to the best of our knowledge, the first comprehensive approach for incorporating various masking mechanisms into Transformer architectures in a scalable way (a brute-force sketch of such masking follows below).

Graph Attention
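The masking mechanisms in question modulate attention weights entrywise, for example with a (block-)Toeplitz matrix whose entries depend only on the relative position i - j. The sketch below shows only the brute-force O(L^2) version of such masked attention, to make the object concrete; the paper's contribution is computing this scalably without materializing the L x L mask, which this sketch does not attempt. The exponential-decay mask is an arbitrary illustrative choice.

```python
# Brute-force masked attention with a Toeplitz mask M[i, j] = f(i - j).
# Illustrates what is being masked; the cited paper is about avoiding
# the explicit L x L computation shown here.
import numpy as np

def toeplitz_masked_attention(Q, K, V, decay=0.1):
    L, d = Q.shape
    scores = np.exp(Q @ K.T / np.sqrt(d))                        # unnormalized attention
    idx = np.arange(L)
    mask = np.exp(-decay * np.abs(idx[:, None] - idx[None, :]))  # Toeplitz: depends on i - j
    A = scores * mask                                            # entrywise masking
    A /= A.sum(axis=1, keepdims=True)                            # row-normalize
    return A @ V

rng = np.random.default_rng(0)
L, d = 8, 4
out = toeplitz_masked_attention(rng.normal(size=(L, d)),
                                rng.normal(size=(L, d)),
                                rng.normal(size=(L, d)))
print(out.shape)   # (8, 4)
```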

Calibrating Over-Parametrized Simulation Models: A Framework via Eligibility Set

no code implementations 27 May 2021 Yuanlu Bai, Tucker Balch, Haoxian Chen, Danial Dervovic, Henry Lam, Svitlana Vyetrenko

Stochastic simulation aims to compute output performance for complex models that lack analytical tractability.
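As background for the quoted sentence, the sketch below estimates an output performance measure (mean waiting time of a single-server queue) by Monte Carlo replication with a confidence interval. It only illustrates what "computing output performance by simulation" means; the paper's eligibility-set calibration framework is not implemented here, and the queue parameters are arbitrary.

```python
# Monte Carlo estimation of an output performance measure: mean waiting time
# of an M/M/1 queue via the Lindley recursion. Background illustration only;
# the cited paper's eligibility-set calibration is not implemented here.
import numpy as np

def simulate_mean_wait(arrival_rate, service_rate, num_customers, rng):
    inter = rng.exponential(1.0 / arrival_rate, size=num_customers)
    service = rng.exponential(1.0 / service_rate, size=num_customers)
    wait, total = 0.0, 0.0
    for i in range(1, num_customers):
        wait = max(0.0, wait + service[i - 1] - inter[i])   # Lindley recursion
        total += wait
    return total / (num_customers - 1)

rng = np.random.default_rng(0)
reps = np.array([simulate_mean_wait(0.8, 1.0, 5_000, rng) for _ in range(50)])
half = 1.96 * reps.std(ddof=1) / np.sqrt(len(reps))
print(f"mean wait ~ {reps.mean():.3f} +/- {half:.3f}")
```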

Interpret-able feedback for AutoML systems

no code implementations 22 Feb 2021 Behnaz Arzani, Kevin Hsieh, Haoxian Chen

Automated machine learning (AutoML) systems aim to enable non-ML experts to train machine learning (ML) models.

Active Learning · AutoML · +1

Demystifying Orthogonal Monte Carlo and Beyond

no code implementations NeurIPS 2020 Han Lin, Haoxian Chen, Tianyi Zhang, Clement Laroche, Krzysztof Choromanski

Orthogonal Monte Carlo (OMC) is a very effective sampling algorithm that imposes structural geometric conditions (orthogonality) on samples for variance reduction.
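The usual way to impose the orthogonality mentioned above is to orthogonalize a block of i.i.d. Gaussian directions (e.g. via QR) and rescale the rows so their norms match the i.i.d. Gaussian case. The sketch below shows that generic construction for a single d x d block; it is background for OMC rather than anything specific to the paper's analysis.

```python
# Orthogonal Monte Carlo samples: orthogonalize an i.i.d. Gaussian block via QR,
# then rescale rows to chi(d)-distributed norms so marginals match the i.i.d. case.
# Generic OMC construction; not specific to the cited paper's results.
import numpy as np

def orthogonal_gaussian_block(d, rng):
    G = rng.normal(size=(d, d))            # i.i.d. Gaussian block
    Q, _ = np.linalg.qr(G)                 # rows of Q are orthonormal
    norms = np.linalg.norm(rng.normal(size=(d, d)), axis=1)   # chi(d)-distributed lengths
    return Q * norms[:, None]              # orthogonal directions, Gaussian-like norms

rng = np.random.default_rng(0)
samples = orthogonal_gaussian_block(4, rng)
print(np.round(samples @ samples.T, 3))    # off-diagonal entries are 0: rows are orthogonal
```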
