Search Results for author: Guangda Huzhang

Found 6 papers, 0 papers with code

Exploit Customer Life-time Value with Memoryless Experiments

no code implementations • 17 Jan 2022 • Zizhao Zhang, Yifei Zhao, Guangda Huzhang

As a measure of the long-term contribution a customer makes over a service or product relationship, life-time value (LTV) provides a more comprehensive basis for finding the optimal service-delivery strategy.

A General Traffic Shaping Protocol in E-Commerce

no code implementations • 30 Dec 2021 • Chenlin Shen, Guangda Huzhang, YuHang Zhou, Chen Liang, Qing Da

Our algorithm optimizes the linear program directly in the primal space, and its solution can be applied through a simple stochastic strategy that fulfills the optimized objective and the constraints in expectation.
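The stochastic-fulfillment idea can be sketched as follows. This is a hypothetical illustration, not the paper's protocol: the segment names, items, and fractional allocation are assumed, standing in for a traffic-shaping LP solved offline. Serving items by sampling from the fractional solution matches the LP's allocation, and hence its objective and constraints, in expectation.

```python
import random

# Assumed fractional LP solution: user segment -> {item: serving probability}.
# In the real system this would come from the traffic-shaping linear program.
lp_solution = {
    "segment_a": {"item_1": 0.7, "item_2": 0.3},
    "segment_b": {"item_1": 0.2, "item_2": 0.8},
}

def serve(segment: str, rng: random.Random) -> str:
    """Pick an item stochastically according to the LP allocation."""
    alloc = lp_solution[segment]
    items = list(alloc)
    return rng.choices(items, weights=[alloc[i] for i in items], k=1)[0]

# Over many requests, realized exposure converges to the LP allocation,
# so constraints imposed on the LP hold in expectation.
rng = random.Random(0)
counts = {"item_1": 0, "item_2": 0}
for _ in range(10_000):
    counts[serve("segment_a", rng)] += 1
```

With 10,000 draws, the empirical share of `item_1` for `segment_a` lands close to the 0.7 allocation.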

Re-ranking With Constraints on Diversified Exposures for Homepage Recommender System

no code implementations • 12 Dec 2021 • Qi Hao, Tianze Luo, Guangda Huzhang

The homepage recommendation on most E-commerce applications places items in a hierarchical manner, where different channels display items in different styles.

Tasks: Recommendation Systems, Re-Ranking

Learning-To-Ensemble by Contextual Rank Aggregation in E-Commerce

no code implementations • 19 Jul 2021 • Xuesi Wang, Guangda Huzhang, Qianying Lin, Qing Da

Combined with the idea of Bayesian Optimization and gradient descent, we solve the online contextual Black-Box Optimization task that finds the optimal weights for sub-models given a chosen RA model.
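A minimal sketch of the learning-to-ensemble setup described above, with every name and number assumed for illustration: given a fixed rank-aggregation (RA) model, the task is to find sub-model weights that maximize a black-box reward. The paper combines Bayesian Optimization with gradient descent; plain random search stands in for that machinery here, since the reward is still treated as a black box.

```python
import random

def ensemble_score(weights, sub_scores):
    """Assumed RA model: a weighted sum of per-item sub-model scores."""
    return [sum(w * s for w, s in zip(weights, item)) for item in sub_scores]

def black_box_reward(weights):
    """Assumed offline reward: relevance of the item ranked first."""
    sub_scores = [(0.9, 0.1), (0.2, 0.8), (0.5, 0.5)]  # 3 items x 2 sub-models
    labels = [1.0, 0.0, 0.3]                            # relevance labels
    scores = ensemble_score(weights, sub_scores)
    top = max(range(len(labels)), key=lambda i: scores[i])
    return labels[top]

def search_weights(trials=200, seed=0):
    """Black-box search over sub-model weights (random search as a stand-in)."""
    rng = random.Random(seed)
    best_w, best_r = None, float("-inf")
    for _ in range(trials):
        w = (rng.random(), rng.random())
        r = black_box_reward(w)
        if r > best_r:
            best_w, best_r = w, r
    return best_w, best_r
```

In this toy reward, any weight vector with more mass on the first sub-model ranks the fully relevant item first, so the search quickly reaches reward 1.0; a Bayesian Optimization surrogate would simply reach it in fewer evaluations.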

AliExpress Learning-To-Rank: Maximizing Online Model Performance without Going Online

no code implementations • 25 Mar 2020 • Guangda Huzhang, Zhen-Jia Pang, Yongqing Gao, Yawen Liu, Weijie Shen, Wen-Ji Zhou, Qing Da, An-Xiang Zeng, Han Yu, Yang Yu, Zhi-Hua Zhou

The framework consists of an evaluator that generalizes to score recommendations in context, a generator that maximizes the evaluator's score via reinforcement learning, and a discriminator that ensures the evaluator's generalization.
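The evaluator-generator interaction can be sketched in miniature. Everything here is an assumption for illustration: the item names, the hand-written list-wise evaluator, and the exhaustive search standing in for the paper's reinforcement-learned generator (and omitting the discriminator entirely). The point is only that the evaluator scores a whole list, so it can capture context between adjacent items, and the generator searches for the list that maximizes that score.

```python
import itertools

def evaluator(ranked_items):
    """Assumed list-wise evaluator: position-discounted relevance plus an
    adjacency bonus for switching category (a simple contextual effect)."""
    relevance = {"a": 0.9, "b": 0.7, "c": 0.4}
    category = {"a": "shoes", "b": "shoes", "c": "hats"}
    score = 0.0
    for pos, item in enumerate(ranked_items):
        score += relevance[item] / (pos + 1)
        if pos > 0 and category[item] != category[ranked_items[pos - 1]]:
            score += 0.2  # context: reward adjacent diversity
    return score

def generate(items):
    """Generator: return the list the evaluator scores highest
    (exhaustive search here; reinforcement learning in the paper)."""
    return max(itertools.permutations(items), key=evaluator)

best = generate(["a", "b", "c"])
```

Note that the best list is not simply sorted by relevance: the contextual bonus makes it worth interleaving categories, which is exactly the kind of list-wise effect a point-wise ranker cannot express.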
