Search Results for author: Ruofei Ouyang

Found 4 papers, 0 papers with code

Decentralized High-Dimensional Bayesian Optimization with Factor Graphs

no code implementations19 Nov 2017 Trong Nghia Hoang, Quang Minh Hoang, Ruofei Ouyang, Kian Hsiang Low

This paper presents a novel decentralized high-dimensional Bayesian optimization (DEC-HBO) algorithm that, in contrast to existing HBO algorithms, can exploit the interdependent effects of various input components on the output of the unknown objective function f to boost BO performance, while preserving scalability in the number of input dimensions and requiring neither prior knowledge nor the existence of a low (effective) dimension of the input space.

Bayesian Optimization, Vocal Bursts Intensity Prediction
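The key idea in the abstract above is decomposing f into low-dimensional, possibly overlapping factors connected in a factor graph, so each factor can be optimized over only its own inputs. A minimal numpy sketch of that decomposition, with a toy two-factor additive objective and plain grid search standing in for the paper's GP-based acquisition optimization (the functions and grids here are illustrative, not from the paper):

```python
import numpy as np

# Toy additive objective f(x) = f1(x0, x1) + f2(x1, x2).
# The two factors share variable x1: an edge in the factor graph.
def f1(a, b):
    return (a - 0.3) ** 2 + 0.5 * (b - 0.6) ** 2

def f2(b, c):
    return (b - 0.6) ** 2 + (c + 0.2) ** 2

grid = np.linspace(-1, 1, 41)

# Each low-dimensional factor is minimized over its own inputs only;
# the shared variable x1 is resolved by minimizing the sum over its domain.
# This never touches the full 3-D grid, which is the scalability point.
best = None
for b in grid:
    a = grid[np.argmin(f1(grid, b))]   # best x0 given x1 = b
    c = grid[np.argmin(f2(b, grid))]   # best x2 given x1 = b
    val = f1(a, b) + f2(b, c)
    if best is None or val < best[0]:
        best = (val, a, b, c)

print(best)  # optimum of the additive objective, found factor by factor
```

Because the objective is additive, the factor-wise minimum coincides with the joint minimum, which is the structure DEC-HBO exploits with GP surrogates and acquisition functions in place of the grids.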

Gaussian Process Decentralized Data Fusion Meets Transfer Learning in Large-Scale Distributed Cooperative Perception

no code implementations16 Nov 2017 Ruofei Ouyang, Kian Hsiang Low

To achieve this, we propose a novel transfer learning mechanism that lets a team of agents share and transfer the information encapsulated in a summary built on one support set to agents using a different support set, incurring a loss that can be theoretically bounded and analyzed.

Transfer Learning
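A rough numpy sketch of the re-summarization idea described above, using a subset-of-regressors GP summary as an illustrative stand-in for the paper's summary structure (all names, data, and the transfer rule below are assumptions for illustration): agent A holds a summary on support set U_a, and agent B transfers it to a different support set U_b by evaluating A's predictive mean at U_b and refitting weights there, with a small approximation loss.

```python
import numpy as np

def rbf(X, Z, ls=0.2):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * ((X[:, None] - Z[None, :]) / ls) ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 30)              # agent A's local observations
y = np.sin(2 * np.pi * X)
U_a = np.linspace(0.0, 1.0, 8)         # A's support set
U_b = np.linspace(0.05, 0.95, 8)       # B's (different) support set
s2 = 0.01                              # observation noise variance

# A's summary: subset-of-regressors posterior-mean weights on U_a,
# so A's predictive mean at x is rbf(x, U_a) @ w_a.
Kux = rbf(U_a, X)
w_a = np.linalg.solve(rbf(U_a, U_a) * s2 + Kux @ Kux.T, Kux @ y)

# Transfer: evaluate A's mean at B's support set, then refit weights
# in B's basis (small jitter keeps the solve well-posed).
m_b = rbf(U_b, U_a) @ w_a
w_b = np.linalg.solve(rbf(U_b, U_b) + 1e-8 * np.eye(8), m_b)

# B can now predict with its own support set: rbf(x, U_b) @ w_b.
```

The two summaries agree closely everywhere, which mirrors the bounded-loss guarantee the abstract refers to; the paper's actual mechanism and bound are more involved than this toy.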

Parallel Gaussian Process Regression with Low-Rank Covariance Matrix Approximations

no code implementations9 Aug 2014 Jie Chen, Nannan Cao, Kian Hsiang Low, Ruofei Ouyang, Colin Keng-Yan Tan, Patrick Jaillet

We theoretically guarantee the predictive performance of our proposed parallel GPs to be equivalent to that of some centralized approximate GP regression methods: the computation of their centralized counterparts can be distributed among parallel machines, thereby achieving greater time efficiency and scalability.

Gaussian Processes, regression
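The equivalence claimed in the abstract can be illustrated with a low-rank (subset-of-regressors) GP: the centralized solution depends on the data only through sums of small matrices, so each machine can compute its shard's contribution and a master simply adds them. A hedged numpy sketch (the kernel, data, and shard count are illustrative, not the paper's setup):

```python
import numpy as np

def rbf(X, Z, ls=0.1):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * ((X[:, None] - Z[None, :]) / ls) ** 2)

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(200)
U = np.linspace(0, 1, 10)   # shared inducing inputs (the low-rank support)
s2 = 0.01                   # observation noise variance

# Centralized subset-of-regressors posterior-mean weights.
Kux = rbf(U, X)
w_central = np.linalg.solve(rbf(U, U) * s2 + Kux @ Kux.T, Kux @ y)

# "Parallel": each of 4 machines summarizes its own shard with one
# 10x10 matrix and one length-10 vector; the master sums them.
P = np.zeros((10, 10))
q = np.zeros(10)
for Xm, ym in zip(np.array_split(X, 4), np.array_split(y, 4)):
    Km = rbf(U, Xm)
    P += Km @ Km.T
    q += Km @ ym
w_parallel = np.linalg.solve(rbf(U, U) * s2 + P, q)

# The two weight vectors agree up to floating-point round-off.
print(np.max(np.abs(w_central - w_parallel)))
```

Only the 10x10 shard statistics cross the network, never the raw data, which is where the time efficiency and scalability come from; the paper develops this for several low-rank covariance approximations with formal guarantees.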

Parallel Gaussian Process Regression with Low-Rank Covariance Matrix Approximations

no code implementations24 May 2013 Jie Chen, Nannan Cao, Kian Hsiang Low, Ruofei Ouyang, Colin Keng-Yan Tan, Patrick Jaillet

We theoretically guarantee the predictive performance of our proposed parallel GPs to be equivalent to that of some centralized approximate GP regression methods: the computation of their centralized counterparts can be distributed among parallel machines, thereby achieving greater time efficiency and scalability.

Gaussian Processes, regression
