Search Results for author: Haibin Yu

Found 7 papers, 2 papers with code

Ad Recommendation in a Collapsed and Entangled World

no code implementations • 22 Feb 2024 • Junwei Pan, Wei Xue, Ximei Wang, Haibin Yu, Xun Liu, Shijie Quan, Xueming Qiu, Dapeng Liu, Lei Xiao, Jie Jiang

In this paper, we present an industrial ad recommendation system, focusing on the challenges and practices of learning appropriate representations.

Feature Correlation • Model Optimization

AdaTask: A Task-aware Adaptive Learning Rate Approach to Multi-task Learning

no code implementations • 28 Nov 2022 • Enneng Yang, Junwei Pan, Ximei Wang, Haibin Yu, Li Shen, Xihua Chen, Lei Xiao, Jie Jiang, Guibing Guo

In this paper, we propose to measure the task dominance degree of a parameter by the total updates that each task applies to it.

Multi-Task Learning • Recommendation Systems
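
The dominance measure described in the AdaTask abstract can be sketched directly: keep a separate accumulator of the updates each task applies to a shared parameter, then compare the per-task totals. The sketch below is a minimal illustration assuming an AdaGrad-style sum of squared gradients as the accumulator; the function names and the exact update rule are not taken from the paper.

```python
import numpy as np

# Hypothetical per-task accumulators over one shared parameter vector.
# Using a sum of squared gradients (AdaGrad-style) is an assumption;
# the paper's exact accumulator and update rule may differ.
n_tasks, dim = 3, 5
accum = np.zeros((n_tasks, dim))

def record_update(task_id, grad):
    # Accumulate the squared gradient a task applies to the parameter.
    accum[task_id] += grad ** 2

def dominance(eps=1e-12):
    # Share of each task in the total accumulated updates, per entry.
    return accum / (accum.sum(axis=0, keepdims=True) + eps)

# Toy run: task 0 produces much larger gradients than tasks 1 and 2,
# so it dominates every entry of the shared parameter.
rng = np.random.default_rng(0)
for _ in range(100):
    record_update(0, 10.0 * rng.normal(size=dim))
    record_update(1, rng.normal(size=dim))
    record_update(2, rng.normal(size=dim))
print(dominance().round(2))  # row 0 is close to 1 everywhere
```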

On Provably Robust Meta-Bayesian Optimization

1 code implementation • 14 Jun 2022 • Zhongxiang Dai, Yizhou Chen, Haibin Yu, Bryan Kian Hsiang Low, Patrick Jaillet

We prove that both algorithms are asymptotically no-regret even when some or all previous tasks are dissimilar to the current task, and show that RM-GP-UCB enjoys better theoretical robustness than RM-GP-TS.

Bayesian Optimization • Meta-Learning • +1
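
Both RM-GP-UCB and RM-GP-TS build on standard GP-based Bayesian optimization. As context, a minimal single-task GP-UCB step looks like the sketch below (NumPy, RBF kernel, grid of candidates); the meta-task weighting and robustness machinery the paper adds are omitted, and beta is a hand-picked constant rather than the schedule the theory prescribes.

```python
import numpy as np

def rbf(A, B, ls=0.2):
    # Squared-exponential kernel between the row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-3):
    # Exact GP posterior mean and variance at test inputs Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.ones(len(Xs)) - (v ** 2).sum(0)  # rbf(x, x) = 1
    return mu, np.maximum(var, 1e-12)

# One GP-UCB step on a toy 1-D objective: query where
# mean + sqrt(beta) * std is largest.
f = lambda x: np.sin(3 * x[:, 0])
X = np.array([[0.1], [0.5], [0.9]])
y = f(X)
Xs = np.linspace(0.0, 1.0, 200)[:, None]
mu, var = gp_posterior(X, y, Xs)
ucb = mu + np.sqrt(2.0 * var)  # beta = 2.0, illustrative only
print("next query:", Xs[np.argmax(ucb)])
```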

Convolutional Normalizing Flows for Deep Gaussian Processes

no code implementations • 17 Apr 2021 • Haibin Yu, Dapeng Liu, Yizhou Chen, Bryan Kian Hsiang Low, Patrick Jaillet

Deep Gaussian processes (DGPs), a hierarchical composition of GP models, have successfully boosted the expressive power of their single-layer counterpart.

Gaussian Processes • Variational Inference
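
The "hierarchical composition of GP models" can be made concrete by sampling from a two-layer DGP prior: draw one function from a GP over the inputs, then feed its values into a second GP. This is a generic sketch of the model class, not of the paper's contribution (convolutional normalizing flows for the variational posterior); the kernel, lengthscale, and depth are arbitrary illustrative choices.

```python
import numpy as np

def rbf(X, ls=0.3):
    # Squared-exponential Gram matrix over the rows of X.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def sample_gp_layer(X, rng, jitter=1e-6):
    # One function sample at inputs X from a zero-mean GP prior.
    K = rbf(X) + jitter * np.eye(len(X))
    return np.linalg.cholesky(K) @ rng.normal(size=len(X))

rng = np.random.default_rng(0)
X = np.linspace(-2.0, 2.0, 100)[:, None]
h = sample_gp_layer(X, rng)           # layer 1: GP over the raw inputs
f = sample_gp_layer(h[:, None], rng)  # layer 2: GP over layer-1 outputs
print(f[:5])
```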

Implicit Posterior Variational Inference for Deep Gaussian Processes

1 code implementation • NeurIPS 2019 • Haibin Yu, Yizhou Chen, Zhongxiang Dai, Kian Hsiang Low, Patrick Jaillet

This paper presents an implicit posterior variational inference (IPVI) framework for DGPs that can ideally recover an unbiased posterior belief and still preserve time efficiency.

Gaussian Processes • Variational Inference
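
An "implicit" posterior is one you can sample from but whose density has no closed form, e.g. Gaussian noise pushed through a neural generator. The sketch below shows only that sampling idea; the generator architecture and sizes are invented for illustration, and the adversarially trained density-ratio estimator IPVI uses to optimize such a posterior is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical generator: a tiny fixed MLP pushing Gaussian noise to
# samples of the inducing outputs u. The distribution of u is defined
# only implicitly by this mapping; there is no density to evaluate.
noise_dim, hidden, num_inducing = 8, 32, 16
W1 = 0.1 * rng.normal(size=(noise_dim, hidden))
W2 = 0.1 * rng.normal(size=(hidden, num_inducing))

def sample_implicit_posterior(n_samples):
    eps = rng.normal(size=(n_samples, noise_dim))
    return np.tanh(eps @ W1) @ W2

u = sample_implicit_posterior(1000)
print("sample mean:", u.mean(axis=0)[:4])
print("sample std: ", u.std(axis=0)[:4])
```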

Stochastic Variational Inference for Bayesian Sparse Gaussian Process Regression

no code implementations • 1 Nov 2017 • Haibin Yu, Trong Nghia Hoang, Kian Hsiang Low, Patrick Jaillet

This paper presents a novel variational inference framework for deriving a family of Bayesian sparse Gaussian process regression (SGPR) models whose approximations are variationally optimal with respect to the full-rank GPR model enriched with the corresponding correlation structures of the observation noise.

GPR • regression • +2
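
For context, sparse GPR models condition on a small set of m inducing inputs instead of all n training points, so only an m x m linear system is ever solved. A minimal DTC/SoR-style predictive mean on invented toy data is sketched below; it is a standard baseline construction in this family, not the paper's variationally optimal derivation.

```python
import numpy as np

def rbf(A, B, ls=0.5):
    # Squared-exponential kernel between the row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def dtc_predictive_mean(X, y, Z, Xs, noise_var=0.01):
    # DTC/SoR sparse predictive mean: solve only the m x m system
    # involving the inducing inputs Z, never the full n x n one.
    Kzz = rbf(Z, Z) + 1e-6 * np.eye(len(Z))
    Kzx = rbf(Z, X)
    Kzs = rbf(Z, Xs)
    A = noise_var * Kzz + Kzx @ Kzx.T
    return Kzs.T @ np.linalg.solve(A, Kzx @ y)

# Toy data: noisy sine, 200 points summarized by 15 inducing inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(200, 1))
y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.normal(size=200)
Z = np.linspace(-2.0, 2.0, 15)[:, None]
Xs = np.linspace(-2.0, 2.0, 5)[:, None]
print(dtc_predictive_mean(X, y, Z, Xs))
```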
