1 code implementation • 5 Feb 2024 • Lam Ngo, Huong Ha, Jeffrey Chan, Vu Nguyen, Hongyu Zhang
To address this issue, a promising solution is to use a local search strategy that partitions the search domain into local regions with a high likelihood of containing the global optimum, and then uses BO to optimize the objective function within these regions.
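The local-region idea can be sketched as follows. This is an illustration only, not the paper's algorithm: simple random search stands in for the inner BO loop, and the partitioning scheme, function, and settings are all hypothetical.

```python
import numpy as np

def local_region_search(f, bounds, n_regions=4, evals_per_region=50, seed=0):
    """Partition a 1-D domain into local regions and search each one,
    keeping the best point found.  Random search stands in for the
    inner BO loop; a GP-based optimizer would replace it in practice."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    edges = np.linspace(lo, hi, n_regions + 1)
    best_x, best_y = None, np.inf
    for a, b in zip(edges[:-1], edges[1:]):
        xs = rng.uniform(a, b, evals_per_region)   # search within one local region
        ys = np.array([f(v) for v in xs])
        i = ys.argmin()
        if ys[i] < best_y:
            best_x, best_y = xs[i], ys[i]
    return best_x, best_y

# Toy example: minimize a 1-D quadratic whose optimum sits in one region.
x, y = local_region_search(lambda v: (v - 2.3) ** 2, (-5.0, 5.0))
```

A real implementation would also rank regions by their likelihood of containing the optimum and spend more of the evaluation budget there.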
no code implementations • 12 Jun 2023 • Huong Ha, Vu Nguyen, Hongyu Zhang, Anton Van Den Hengel
Our method uses a multi-armed bandit technique (EXP3) to add random data points to the BO process, and employs a novel training loss function for the GP hyperparameter estimation process that ensures unbiased estimation from the observed data.
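The EXP3 component can be illustrated with a minimal two-armed version, where one arm represents the BO-suggested point and the other a random point. This is a generic EXP3 sketch under assumed toy rewards, not the paper's loss function or update rule.

```python
import numpy as np

def exp3_step(weights, gamma, reward_fn, rng):
    """One EXP3 round: arm 0 = BO-style suggestion, arm 1 = random point
    (labels are illustrative).  Rewards are assumed to lie in [0, 1]."""
    K = len(weights)
    probs = (1 - gamma) * weights / weights.sum() + gamma / K
    arm = rng.choice(K, p=probs)
    reward = reward_fn(arm)
    est = reward / probs[arm]              # importance-weighted reward estimate
    weights[arm] *= np.exp(gamma * est / K)
    weights /= weights.max()               # rescale for numerical stability
    return arm, weights

rng = np.random.default_rng(1)
w = np.ones(2)
for _ in range(200):
    # Toy environment: arm 0 pays off with prob. 0.8, arm 1 with prob. 0.2.
    _, w = exp3_step(w, 0.1, lambda a: float(rng.random() < (0.8, 0.2)[a]), rng)
```

Over the rounds, EXP3 shifts probability mass toward whichever arm yields higher reward while keeping a floor of exploration (`gamma / K`) on the other.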
no code implementations • 27 Dec 2022 • Huong Ha, Zongwen Fan, Hongyu Zhang
We also develop a novel uncertainty calibration technique to ensure the reliability of the confidence intervals generated by a Bayesian prediction model.
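One generic way to calibrate interval widths, shown purely as an illustration (the paper's technique is novel and not reproduced here), is to rescale predictive standard deviations so that a target interval achieves its nominal coverage on held-out data:

```python
import numpy as np

def calibrate_scale(residuals, sigmas, level=0.9):
    """Return a multiplier for predictive std-devs so the central
    `level` interval matches empirical coverage on held-out data
    (a generic recalibration sketch, not the paper's method)."""
    z = np.abs(residuals) / sigmas         # standardized errors
    return np.quantile(z, level) / 1.6449  # 1.6449 = Gaussian z for 90% two-sided

# Example: a model that underestimates its uncertainty by a factor of 2.
rng = np.random.default_rng(0)
sigmas = np.ones(5000)
residuals = rng.normal(0.0, 2.0, 5000)
scale = calibrate_scale(residuals, sigmas)
```

Here `scale` comes out near 2, i.e. the intervals must be roughly doubled to be reliable.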
no code implementations • 16 Dec 2022 • Huong Ha
Particularly important is the problem of monitoring the performance of machine learning systems across all data subgroups (subpopulations).
no code implementations • 11 Apr 2021 • Huong Ha, Sunil Gupta, Santu Rana, Svetha Venkatesh
Machine learning models are being used extensively in many important areas, but there is no guarantee a model will always perform well or as its developers intended.
1 code implementation • 14 Feb 2021 • Xingchen Wan, Vu Nguyen, Huong Ha, Binxin Ru, Cong Lu, Michael A. Osborne
High-dimensional black-box optimisation remains an important yet notoriously challenging problem.
1 code implementation • 17 Dec 2020 • Huong Ha, Sunil Gupta, Santu Rana, Svetha Venkatesh
In particular, we consider two types of LSE problems: (1) the explicit LSE problem, where the threshold level is a fixed user-specified value, and (2) the implicit LSE problem, where the threshold level is defined as a percentage of the (unknown) maximum of the objective function.
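The distinction between the two settings can be made concrete with a small sketch of the implicit case, where the threshold is a fraction of the (unknown) maximum and must be estimated from the observations. This plug-in estimate is an assumption for illustration, not the paper's estimator.

```python
import numpy as np

def implicit_superlevel(xs, ys, eps=0.8):
    """Implicit LSE sketch: the threshold is eps times the (unknown)
    maximum, here replaced by a plug-in estimate from the observed
    values, so it tightens as the max estimate improves."""
    thresh = eps * ys.max()
    return xs[ys >= thresh], thresh

# Toy example: a quadratic peaking at x = 0.5 with maximum 1.0.
xs = np.linspace(0.0, 1.0, 101)
ys = 1.0 - (xs - 0.5) ** 2
sel, thresh = implicit_superlevel(xs, ys, eps=0.8)
```

In the explicit setting, `thresh` would instead be a fixed user-specified value, independent of the observations.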
no code implementations • NeurIPS 2020 • Hung Tran-The, Sunil Gupta, Santu Rana, Huong Ha, Svetha Venkatesh
To this end, we propose a novel BO algorithm which expands (and shifts) the search space over iterations by controlling the expansion rate through a hyperharmonic series.
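The role of the hyperharmonic series can be seen in a minimal 1-D sketch: because a p-series with p > 1 converges, growing the bounds by its terms keeps the total expansion finite. The expansion rule below is illustrative only, not the paper's algorithm.

```python
import numpy as np

def expanded_bounds(base_lo, base_hi, t, p=1.5, c=1.0):
    """Expand a 1-D search interval at iteration t using the partial sum
    of a hyperharmonic series (p > 1); since sum 1/k**p converges, the
    total expansion remains bounded no matter how long we run."""
    growth = c * sum(1.0 / k**p for k in range(1, t + 1))
    return base_lo - growth, base_hi + growth

lo1, hi1 = expanded_bounds(0.0, 1.0, t=1)     # first expansion step
lo2, hi2 = expanded_bounds(0.0, 1.0, t=100)   # after many iterations
```

With a plain harmonic series (p = 1) the partial sums diverge, so the search space would grow without bound; p > 1 is what keeps the expansion controlled.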
1 code implementation • 19 Jan 2020 • Thanh Tang Nguyen, Sunil Gupta, Huong Ha, Santu Rana, Svetha Venkatesh
We adopt a distributionally robust optimization perspective on this problem, maximizing the expected objective under the most adversarial distribution.
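The "most adversarial distribution" idea can be illustrated with one standard ambiguity set, where the adversary may reweight scenarios by at most a factor of 1/alpha; the resulting worst-case expectation concentrates on the lowest-value outcomes. This is a textbook-style sketch, not the paper's formulation.

```python
import numpy as np

def worst_case_expectation(values, alpha=0.25):
    """Worst-case expectation over distributions whose density w.r.t.
    the empirical one is bounded by 1/alpha: the adversary piles mass
    on the worst alpha-fraction of outcomes (a standard DRO set,
    shown here only as an illustration)."""
    k = max(1, int(np.ceil(alpha * len(values))))
    return np.sort(values)[:k].mean()

def dro_choice(candidates, scenario_values, alpha=0.25):
    """Pick the candidate maximizing the worst-case expectation."""
    scores = [worst_case_expectation(v, alpha) for v in scenario_values]
    return candidates[int(np.argmax(scores))]

# "risky" has a higher mean but a catastrophic scenario; DRO prefers "safe".
choice = dro_choice(
    ["risky", "safe"],
    [np.array([1.0, 1.0, 1.0, -10.0]), np.array([0.5, 0.5, 0.5, 0.5])],
)
```

Maximizing the plain empirical mean would pick "risky"; the adversarial expectation flips the choice.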
1 code implementation • NeurIPS 2019 • Huong Ha, Santu Rana, Sunil Gupta, Thanh Nguyen, Hung Tran-The, Svetha Venkatesh
Applying Bayesian optimization in problems wherein the search space is unknown is challenging.