Search Results for author: Xingchen Ma

Found 6 papers, 4 papers with code

A Corrected Expected Improvement Acquisition Function Under Noisy Observations

1 code implementation • 8 Oct 2023 • Han Zhou, Xingchen Ma, Matthew B. Blaschko

Sequential maximization of expected improvement (EI) is one of the most widely used policies in Bayesian optimization because of its simplicity and ability to handle noisy observations.

Bayesian Optimization • Model Compression
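As a quick reminder of the baseline that this paper corrects, the classical noise-free expected improvement has a closed form under a Gaussian-process posterior. The sketch below shows only that generic baseline (the minimization convention, the function name, and the small numerical guard are assumptions for illustration); the noise-corrected acquisition proposed in the paper is in the linked implementation and is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Classical expected improvement for minimization (noise-free setting).

    mu, sigma : GP posterior mean and standard deviation at candidate points
    f_best    : best (lowest) objective value observed so far
    """
    sigma = np.maximum(sigma, 1e-12)   # guard against a zero predictive std
    z = (f_best - mu) / sigma          # standardized improvement
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
```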

Confidence-aware Personalized Federated Learning via Variational Expectation Maximization

1 code implementation • CVPR 2023 • Junyi Zhu, Xingchen Ma, Matthew B. Blaschko

A global model is introduced as a latent variable to augment the joint distribution of clients' parameters and to capture the common trends of different clients; optimization is derived from the principle of maximizing the marginal likelihood and is conducted using variational expectation maximization.

Personalized Federated Learning • Variational Inference
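For orientation only, the generic principle behind variational expectation maximization is to maximize a lower bound on the marginal likelihood. The symbols below (client data D, client parameters θ, global latent model w, variational posterior q) are generic placeholders, not the paper's exact notation or derivation.

```latex
\log p(\mathcal{D} \mid \theta) \;\ge\;
\mathbb{E}_{q(w)}\big[\log p(\mathcal{D} \mid \theta, w)\big]
- \mathrm{KL}\big(q(w)\,\|\,p(w)\big)
```

The E-step tightens the bound by updating q(w), and the M-step updates the client parameters θ given the current q(w).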

Meta-Cal: Well-controlled Post-hoc Calibration by Ranking

1 code implementation • 10 May 2021 • Xingchen Ma, Matthew B. Blaschko

In this paper, we introduce two constraints that are worth considering when designing a calibration map for post-hoc calibration.

Multi-class Classification
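For context, a post-hoc calibration map transforms a trained classifier's scores without retraining the classifier. The snippet below shows temperature scaling, a common such map, purely as a generic illustration of the concept; it is not the Meta-Cal method, which additionally relies on a ranking model and is available in the linked code.

```python
import numpy as np

def temperature_scale(logits, temperature):
    """A simple post-hoc calibration map: softmax of temperature-scaled logits.

    logits      : (n_samples, n_classes) uncalibrated scores from a classifier
    temperature : scalar T > 0, typically fitted on a held-out validation set
    """
    scaled = logits / temperature
    scaled -= scaled.max(axis=1, keepdims=True)   # numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum(axis=1, keepdims=True)   # calibrated probabilities
```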

Additive Tree-Structured Covariance Function for Conditional Parameter Spaces in Bayesian Optimization

no code implementations • 21 Jun 2020 • Xingchen Ma, Matthew B. Blaschko

Bayesian optimization (BO) is a sample-efficient global optimization algorithm for black-box functions which are expensive to evaluate.

Bayesian Optimization • Model Compression • +1
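As a simplified illustration of the additive idea only: an additive covariance function sums independent kernels defined on disjoint groups of input dimensions. The sketch below uses plain RBF components and hand-picked groups as assumptions; the tree structure and the handling of conditional parameter spaces are specific to the paper and are not reproduced here.

```python
import numpy as np

def rbf(x, y, lengthscale=1.0):
    """Squared-exponential kernel on a subset of the input dimensions."""
    return np.exp(-0.5 * np.sum((x - y) ** 2) / lengthscale ** 2)

def additive_kernel(x, y, groups, lengthscales):
    """Sum of per-group RBF kernels; `groups` lists the index set of each group."""
    return sum(rbf(x[g], y[g], ls) for g, ls in zip(groups, lengthscales))

# Example: a 4-dimensional input split into two groups of dimensions.
k = additive_kernel(np.ones(4), np.zeros(4),
                    groups=[[0, 1], [2, 3]], lengthscales=[1.0, 0.5])
```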
